TBuddy - Terminal Assistant Powered by On-Device LLM
TBuddy is an intelligent terminal assistant that converts natural language queries into bash commands using on-device (<1B params) Large Language Models (LLMs). It provides both a command-line interface and a daemon service for seamless terminal command generation.
Features
- Natural Language to Bash Commands: Convert plain English descriptions into executable bash commands
- On-Device LLM Integration: Uses Ollama with sub-1-billion-parameter local models to balance privacy, speed, memory usage, and accuracy
- Semantic Example Selection: Leverages vector embeddings to find relevant command examples from a pre-curated list
- Dual Operation Modes:
- One-off command generation
- Background daemon service for persistent, faster availability
- Rich Example Database: Comprehensive collection of text-to-command examples (available per request)
- Safe Command Generation: Focuses on standard, safe bash commands
Architecture
Core Components
terminal-buddy/
├── src/terminal_buddy/
│   ├── main.py                  # CLI interface and server logic
│   └── utils/
│       ├── llm_functions.py     # LLM integration with Ollama
│       ├── config.py            # Configuration management
│       ├── prompts.py           # System prompts and templates
│       ├── example_selection.py # Vector-based example retrieval
│       └── examples.json        # Command examples database
├── data/examples/
│   └── text_2_command_examples.json # Training examples (not included in the repo, but this is where you'd place your own)
└── tests/                       # Test suite
Key Technologies
- Ollama: Local LLM inference engine
- LangChain: Vector embeddings and example selection
- ChromaDB: Vector database for semantic search
- Typer: Modern CLI framework
- Pydantic: Configuration and data validation
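To make the example-selection flow concrete, here is a minimal sketch using plain cosine similarity. In the real project the vectors come from nomic-embed-text via LangChain and ChromaDB; `mock_embed`, `select_examples`, and the sample data below are illustrative stand-ins, not TBuddy's actual API:

```python
import math

def mock_embed(text: str) -> list[float]:
    # Toy bag-of-letters "embedding" standing in for nomic-embed-text output.
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def select_examples(query: str, examples: list[dict], k: int = 2) -> list[dict]:
    """Return the k examples whose queries are most similar to the input."""
    qv = mock_embed(query)
    ranked = sorted(
        examples,
        key=lambda ex: cosine(qv, mock_embed(ex["user_query"])),
        reverse=True,
    )
    return ranked[:k]

examples = [
    {"user_query": "list all hidden files", "command": "ls -a"},
    {"user_query": "check disk space", "command": "df -h"},
    {"user_query": "show running processes", "command": "ps aux"},
]
best = select_examples("list hidden files in a directory", examples, k=1)
```

The selected examples are then injected into the system prompt so the small model can imitate them, which is what makes sub-1B models viable for this task.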
Prerequisites
- Python 3.12 or higher
- Ollama installed and running
- Required Ollama models:
  - qwen3:0.6b (for command generation)
  - nomic-embed-text (for embeddings)
Installation
Using Poetry (Recommended)
# Clone the repository
git clone <repository-url>
cd terminal-buddy
# Install dependencies
poetry install
# Include development dependencies
poetry install --with dev
Using pip
# Clone the repository
git clone <repository-url>
cd terminal-buddy
# Install dependencies
pip install -r requirements.txt
# Install in development mode
pip install -e .
Setup Ollama Models
# Pull required models
ollama pull qwen3:0.6b
ollama pull nomic-embed-text
Usage
Command Line Interface
TBuddy provides a simple CLI for generating commands:
# Basic usage
tb "list all files in current directory"
# Using the full command
tb query "show me disk usage"
# Start background service
tb serve
Examples
# File operations
tb "create a new directory called projects"
# Output: mkdir projects
tb "list all hidden files"
# Output: ls -a
# System information
tb "check disk space"
# Output: df -h
tb "show running processes"
# Output: ps aux
# Text processing
tb "find all .txt files"
# Output: find . -name "*.txt"
tb "search for 'error' in log files"
# Output: grep -r "error" *.log
Daemon Mode
For persistent availability, run TBuddy as a background service:
# Start the daemon
tb serve
# In another terminal, send queries
tb "check memory usage"
The daemon runs on 127.0.0.1:65432 and automatically handles multiple concurrent requests.
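The daemon's request/response loop can be sketched with plain TCP sockets. The wire protocol shown here is an assumption, not TBuddy's documented protocol: the handler returns a canned reply where the real daemon would invoke the LLM, and the sketch binds an ephemeral port so it never collides with a daemon already listening on 65432.

```python
import socket
import threading

HOST = "127.0.0.1"  # the real daemon listens on 127.0.0.1:65432

def handle(conn: socket.socket) -> None:
    # Hypothetical protocol: read one query, write one reply.
    # The real daemon would call the LLM here instead of echoing.
    with conn:
        query = conn.recv(4096).decode()
        conn.sendall(f"# command for: {query}".encode())

def serve_once(sock: socket.socket) -> None:
    conn, _addr = sock.accept()
    handle(conn)

# Server side: bind port 0 to get a free ephemeral port for this sketch.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind((HOST, 0))
server.listen(1)
port = server.getsockname()[1]
t = threading.Thread(target=serve_once, args=(server,), daemon=True)
t.start()

# Client side: roughly what `tb "check memory usage"` does under the hood.
with socket.create_connection((HOST, port)) as client:
    client.sendall(b"check memory usage")
    reply = client.recv(4096).decode()
t.join(timeout=2)
server.close()
```

Keeping a daemon resident avoids reloading the model and the embedding index on every invocation, which is where the "faster availability" comes from.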
Configuration
Configuration is managed through the Config class in src/terminal_buddy/utils/config.py:
from pydantic import BaseModel, Field

class Config(BaseModel):
    OLLAMA_MODEL_NAME: str = Field(default="qwen3:0.6b")
    OLLAMA_EMBEDDINGS_MODEL_NAME: str = Field(default="nomic-embed-text")
    EXAMPLES_JSON_PATH: str = Field(default="path/to/examples.json")
Development
Project Structure
- main.py: Entry point with CLI commands and server logic
- utils/llm_functions.py: Core LLM integration using Ollama
- utils/example_selection.py: Vector-based example retrieval using LangChain
- utils/prompts.py: System prompts for command generation
- data/examples/: Training data with text-to-command mappings
Adding New Examples
To improve command generation, add new examples to data/examples/text_2_command_examples.json:
{
  "user_query": "Your natural language description",
  "command": "corresponding bash command"
}
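Appending an entry programmatically is a read-modify-write on that JSON list. A minimal sketch, using a temporary file in place of the real data/examples/text_2_command_examples.json:

```python
import json
import tempfile
from pathlib import Path

new_example = {
    "user_query": "Your natural language description",
    "command": "corresponding bash command",
}

# Temp file standing in for data/examples/text_2_command_examples.json.
path = Path(tempfile.mkdtemp()) / "text_2_command_examples.json"
path.write_text(json.dumps([]))  # start from an empty example list

# Read the existing list, append the new example, write it back.
examples = json.loads(path.read_text())
examples.append(new_example)
path.write_text(json.dumps(examples, indent=2))

loaded = json.loads(path.read_text())
```

New examples take effect once the vector index is rebuilt over the updated file.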
Running Tests
# Run tests
poetry run pytest
# Run with coverage
poetry run pytest --cov=terminal_buddy
Contributing
- Fork the repository
- Create a feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
Development Guidelines
- Follow PEP 8 style guidelines
- Add tests for new features
- Update documentation for API changes
- Use type hints for all functions
Performance
- Response Time: ~1-3 seconds for command generation
- Memory Usage: ~2-4GB RAM (depending on model size)
- Accuracy: High for common terminal operations
- Safety: Focuses on standard, safe bash commands
Security
- Local Processing: All LLM operations run locally via Ollama
- No Data Transmission: No queries or responses are sent to external services
- Safe Commands: System prompts emphasize safe, standard bash commands
- Input Validation: All inputs are validated before processing
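As a rough illustration of what such checks might look like, here is a sketch of pre-flight query validation and a deny-list scan of generated commands. These functions and the token list are hypothetical, not TBuddy's actual code:

```python
import shlex

# Hypothetical deny-list of destructive binaries, for illustration only.
DANGEROUS_TOKENS = {"rm", "mkfs", "dd", "shutdown", "reboot"}

def validate_query(query: str) -> str:
    """Reject empty or oversized queries before they reach the LLM."""
    query = query.strip()
    if not query:
        raise ValueError("empty query")
    if len(query) > 500:
        raise ValueError("query too long")
    return query

def looks_risky(command: str) -> bool:
    """Flag generated commands that invoke destructive binaries."""
    try:
        tokens = shlex.split(command)
    except ValueError:
        return True  # unparseable LLM output is treated as risky
    return any(tok in DANGEROUS_TOKENS for tok in tokens)
```

Prompt-level safety instructions plus a post-generation check like this give two independent layers, which matters because small models follow prompts imperfectly.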
License
This project is licensed under the GNU General Public License v3 - see the LICENSE file for details.
Acknowledgments
- Ollama for local LLM inference
- LangChain for vector operations and example selection
- Typer for the CLI framework
- Pydantic for data validation
Support
For questions, issues, or contributions:
- Open an issue on GitHub
- Check the documentation
- Review existing examples in data/examples/
Made with ❤️ for the terminal community
File details
Details for the file terminal_buddy-0.1.1.tar.gz.
File metadata
- Download URL: terminal_buddy-0.1.1.tar.gz
- Upload date:
- Size: 37.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.4 CPython/3.12.11 Linux/6.11.0-1018-azure
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 4887fbc0eb2fff3de594f1a736967a35f4722f0db10a1a148ef261c5da692c15 |
| MD5 | 8d651a6bae8eba8ea41d08a1b8bf4aab |
| BLAKE2b-256 | 2e376d4852df8a91cf6de95476af5be119bc416d7f36169e65001625041d2ed8 |
File details
Details for the file terminal_buddy-0.1.1-py3-none-any.whl.
File metadata
- Download URL: terminal_buddy-0.1.1-py3-none-any.whl
- Upload date:
- Size: 36.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.4 CPython/3.12.11 Linux/6.11.0-1018-azure
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 28ae5707e789b942c9639f1174370317758e49548e6fa34f201a5ef8020aeadb |
| MD5 | 3db2cd995bad1fa44414c788340f5ab4 |
| BLAKE2b-256 | 9f033dbd20378e78806227928c7138d456b7803195794050f1a5e8c41a156a62 |