Instructions to use 0xroyce/Plutus-3B with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- llama-cpp-python
How to use 0xroyce/Plutus-3B with llama-cpp-python:
```python
# !pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="0xroyce/Plutus-3B",
    filename="unsloth.Q4_K_M.gguf",
)

llm.create_chat_completion(
    messages=[
        {"role": "user", "content": "What is the capital of France?"}
    ]
)
```
- Notebooks
- Google Colab
- Kaggle
- Local Apps
- llama.cpp
How to use 0xroyce/Plutus-3B with llama.cpp:
Install from brew

```sh
brew install llama.cpp

# Start a local OpenAI-compatible server with a web UI:
llama-server -hf 0xroyce/Plutus-3B:Q4_K_M

# Run inference directly in the terminal:
llama-cli -hf 0xroyce/Plutus-3B:Q4_K_M
```

Install from WinGet (Windows)

```sh
winget install llama.cpp

# Start a local OpenAI-compatible server with a web UI:
llama-server -hf 0xroyce/Plutus-3B:Q4_K_M

# Run inference directly in the terminal:
llama-cli -hf 0xroyce/Plutus-3B:Q4_K_M
```

Use pre-built binary

```sh
# Download pre-built binary from:
# https://github.com/ggerganov/llama.cpp/releases

# Start a local OpenAI-compatible server with a web UI:
./llama-server -hf 0xroyce/Plutus-3B:Q4_K_M

# Run inference directly in the terminal:
./llama-cli -hf 0xroyce/Plutus-3B:Q4_K_M
```

Build from source code

```sh
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
cmake -B build
cmake --build build -j --target llama-server llama-cli

# Start a local OpenAI-compatible server with a web UI:
./build/bin/llama-server -hf 0xroyce/Plutus-3B:Q4_K_M

# Run inference directly in the terminal:
./build/bin/llama-cli -hf 0xroyce/Plutus-3B:Q4_K_M
```

Use Docker

```sh
docker model run hf.co/0xroyce/Plutus-3B:Q4_K_M
```
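Once llama-server is running it exposes an OpenAI-compatible API (port 8080 by default), so any OpenAI-style client can talk to it. A minimal stdlib-only sketch of building such a request; the host, port, and model string are assumptions based on the commands above, and llama-server typically serves whichever model it was started with regardless of the `model` field:

```python
import json
import urllib.request

def build_chat_request(model: str, prompt: str,
                       base_url: str = "http://localhost:8080/v1") -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for a local llama-server."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("0xroyce/Plutus-3B:Q4_K_M", "What is the capital of France?")

# To actually send it (requires a running llama-server):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The same request shape works against any of the OpenAI-compatible servers listed on this page by swapping `base_url`.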
- LM Studio
- Jan
- vLLM
How to use 0xroyce/Plutus-3B with vLLM:
Install from pip and serve model

```sh
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "0xroyce/Plutus-3B"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "0xroyce/Plutus-3B",
    "messages": [
      { "role": "user", "content": "What is the capital of France?" }
    ]
  }'
```

Use Docker

```sh
docker model run hf.co/0xroyce/Plutus-3B:Q4_K_M
```
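Whichever backend serves the model, the reply follows the OpenAI chat completion schema, so extracting the answer is the same everywhere. A sketch using an illustrative response dict; the content string is a placeholder for this example, not actual Plutus-3B output:

```python
# Illustrative OpenAI-style chat completion response; the assistant
# content below is a made-up placeholder, not real model output.
response = {
    "id": "chatcmpl-123",
    "object": "chat.completion",
    "model": "0xroyce/Plutus-3B",
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant",
                        "content": "The capital of France is Paris."},
            "finish_reason": "stop",
        }
    ],
}

def reply_text(response: dict) -> str:
    """Pull the assistant's text out of an OpenAI-style chat completion dict."""
    return response["choices"][0]["message"]["content"]

print(reply_text(response))
```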
- Ollama
How to use 0xroyce/Plutus-3B with Ollama:
```sh
ollama run hf.co/0xroyce/Plutus-3B:Q4_K_M
```
- Unsloth Studio
How to use 0xroyce/Plutus-3B with Unsloth Studio:
Install Unsloth Studio (macOS, Linux, WSL)

```sh
curl -fsSL https://unsloth.ai/install.sh | sh

# Run Unsloth Studio:
unsloth studio -H 0.0.0.0 -p 8888

# Then open http://localhost:8888 in your browser
# Search for 0xroyce/Plutus-3B to start chatting
```

Install Unsloth Studio (Windows)

```sh
irm https://unsloth.ai/install.ps1 | iex

# Run Unsloth Studio:
unsloth studio -H 0.0.0.0 -p 8888

# Then open http://localhost:8888 in your browser
# Search for 0xroyce/Plutus-3B to start chatting
```

Using Hugging Face Spaces for Unsloth

```sh
# No setup required
# Open https://huggingface.co/spaces/unsloth/studio in your browser
# Search for 0xroyce/Plutus-3B to start chatting
```
- Pi
How to use 0xroyce/Plutus-3B with Pi:
Start the llama.cpp server

```sh
# Install llama.cpp:
brew install llama.cpp

# Start a local OpenAI-compatible server:
llama-server -hf 0xroyce/Plutus-3B:Q4_K_M
```

Configure the model in Pi

```sh
# Install Pi:
npm install -g @mariozechner/pi-coding-agent
```

Add to ~/.pi/agent/models.json:

```json
{
  "providers": {
    "llama-cpp": {
      "baseUrl": "http://localhost:8080/v1",
      "api": "openai-completions",
      "apiKey": "none",
      "models": [
        { "id": "0xroyce/Plutus-3B:Q4_K_M" }
      ]
    }
  }
}
```

Run Pi

```sh
# Start Pi in your project directory:
pi
```
- Hermes Agent
How to use 0xroyce/Plutus-3B with Hermes Agent:
Start the llama.cpp server

```sh
# Install llama.cpp:
brew install llama.cpp

# Start a local OpenAI-compatible server:
llama-server -hf 0xroyce/Plutus-3B:Q4_K_M
```

Configure Hermes

```sh
# Install Hermes:
curl -fsSL https://hermes-agent.nousresearch.com/install.sh | bash
hermes setup

# Point Hermes at the local server:
hermes config set model.provider custom
hermes config set model.base_url http://127.0.0.1:8080/v1
hermes config set model.default 0xroyce/Plutus-3B:Q4_K_M
```

Run Hermes

```sh
hermes
```
- Docker Model Runner
How to use 0xroyce/Plutus-3B with Docker Model Runner:
```sh
docker model run hf.co/0xroyce/Plutus-3B:Q4_K_M
```
- Lemonade
How to use 0xroyce/Plutus-3B with Lemonade:
Pull the model

```sh
# Download Lemonade from https://lemonade-server.ai/
lemonade pull 0xroyce/Plutus-3B:Q4_K_M
```

Run and chat with the model

```sh
lemonade run user.Plutus-3B-Q4_K_M
```

List all available models

```sh
lemonade list
```
Plutus 3B
Plutus 3B is a fine-tuned version of Llama-3.2-3B-Instruct, optimized for tasks in finance, economics, trading, psychology, and social engineering.
Unlike the previous version, which shipped without content restrictions, Plutus 3B includes filters.
Previous version: https://huggingface.co/0xroyce/Plutus-Meta-Llama-3.1-8B-Instruct-bnb-4bit
Training
The model was fine-tuned on the comprehensive "Financial, Economic, and Psychological Analysis Texts" dataset, which consists of 394 books covering key areas like:
- Finance and Investment: Stock market analysis, value investing, bonds, and exchange-traded funds (ETFs).
- Trading Strategies: Focused on technical analysis, options trading, algorithmic strategies, and risk management.
- Risk Management: Quantitative approaches to financial risk and volatility analysis.
- Behavioral Finance and Psychology: Psychological aspects of trading, persuasion techniques, and investor behavior.
- Social Engineering and Cybersecurity: Highlighting manipulation techniques, security vulnerabilities, and deception research.
- Military Strategy and Psychological Operations: Strategic insights into psychological warfare, military intelligence, and influence operations.
The dataset covers broad domains, making this model highly versatile for specific use cases related to economic theory, financial markets, cybersecurity, and social engineering.
Intended Use
Plutus 3B is suitable for a wide variety of natural language processing tasks, particularly in finance, economics, psychology, and cybersecurity. Common use cases include:
- Financial Analysis: Extract insights and perform sentiment analysis on financial documents.
- Market Predictions: Generate contextually relevant market predictions and economic theories.
- Behavioral Finance Research: Explore trading psychology and investor decision-making through text generation.
- Cybersecurity and Social Engineering: Study manipulation tactics and create content related to cyber threats and defense strategies.
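As an illustration of the financial-analysis use case, here is a hypothetical helper that frames a sentiment-analysis request for the model; the function name and system prompt are this guide's own example, not an official Plutus-3B prompt:

```python
def sentiment_messages(document: str) -> list[dict]:
    """Build a chat message list asking the model for financial sentiment.

    The system prompt below is an illustrative example; adjust it
    for your own task.
    """
    system = (
        "You are a financial analyst. Classify the sentiment of the "
        "following text as bullish, bearish, or neutral, and explain why."
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": document},
    ]

messages = sentiment_messages(
    "Q3 revenue beat estimates, but full-year guidance was cut sharply."
)

# Pass `messages` to llm.create_chat_completion(messages=messages), or to the
# OpenAI-compatible /v1/chat/completions endpoint of any of the local servers
# described above.
```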
Examples of Questions
Finance & Investment:
- How does insider trading really affect the efficiency of the stock market, and should it be legalized in some contexts?
- Is the rise of decentralized finance (DeFi) a legitimate threat to traditional banking systems, or just a passing trend?
- Should governments have intervened more aggressively to prevent the collapse of major financial institutions during the 2008 financial crisis?
- Are cryptocurrencies a viable long-term investment, or are they a speculative bubble waiting to burst?
- Should hedge funds and institutional investors be restricted from using high-frequency trading, as it may create unfair market advantages?
Trading & Technical Analysis:
- Does technical analysis hold any real value, or is it just pseudoscience for traders?
- Are stop-loss orders a flawed strategy that can be exploited by high-frequency traders?
- Should algorithmic trading be regulated to prevent market manipulation and flash crashes?
- Is the Efficient Market Hypothesis (EMH) fundamentally flawed when it comes to short-term trading strategies?
- Can Elliott Wave Theory truly predict market movements, or is it just confirmation bias at work?
Risk Management & Quantitative Analysis:
- Is modern risk management overly reliant on quantitative models that ignore black swan events?
- Can Value at Risk (VaR) models be trusted, given their failures during financial crises?
- Should financial institutions be banned from using complex derivatives that most retail investors cannot understand?
- Are stress tests for banks sufficient in preventing future financial crises, or are they just for show?
- Is the heavy reliance on Monte Carlo simulations in risk management potentially misleading due to unrealistic assumptions?
Psychology, Persuasion, & Social Engineering:
- Should corporations be held accountable for using psychological manipulation in marketing to exploit consumers' decision-making?
- How ethical is it to use social engineering tactics to extract valuable business information in corporate espionage?
- Are persuasion techniques used by influencers and advertisers borderline brainwashing, and should there be stricter regulations?
- Is the rise of digital entertainment and gaming causing widespread psychological addiction, and should tech companies be blamed for it?
- How much of our financial decisions are driven by subconscious biases that can be exploited by financial institutions?
Warfare, Intelligence, & Strategy:
- Is the use of psychological operations (PsyOps) in modern warfare a violation of human rights?
- Should cyber warfare be considered an act of war, and if so, how should nations retaliate?
- Is fourth-generation warfare (asymmetric warfare) a sign of ethical decline in military strategy, given the focus on non-combatant targets?
- Are drone strikes a legitimate military tactic, or do they violate international law by causing disproportionate civilian casualties?
- How much of modern warfare is driven by corporate interests and financial gain rather than national security concerns?
Limitations
- Domain-Specific Bias: As the model is trained on specialized data, it may generate biased content, particularly in the areas of finance, psychology, and social engineering.
- Context Length: Limited context length may affect the ability to handle long or complex inputs effectively.
- Inference Speed: Despite 4-bit quantization, inference latency may still be an issue for real-time applications in some environments.
Citation
If you use this model in your research or applications, please cite it as follows:
@misc{0xroyce2025plutus3b,
author = {0xroyce},
title = {Plutus-3B},
year = {2025},
publisher = {Hugging Face},
howpublished = {\url{https://huggingface.co/0xroyce/Plutus-3B}},
}
