
TBuddy - Terminal Assistant Powered by On-Device LLM


TBuddy is an intelligent terminal assistant that converts natural language queries into bash commands using small (<1B-parameter) on-device Large Language Models (LLMs). It provides both a command-line interface and a daemon service for seamless terminal command generation.

🚀 Features

  • Natural Language to Bash Commands: Convert plain English descriptions into executable bash commands
  • On-Device LLM Integration: Uses Ollama with small (sub-1-billion-parameter) local models to balance privacy, speed, memory usage, and accuracy
  • Semantic Example Selection: Leverages vector embeddings to find relevant command examples from a pre-curated list
  • Dual Operation Modes:
    • One-off command generation
    • Background daemon service for persistent availability and faster responses
  • Rich Example Database: Comprehensive collection of text-to-command examples (available per request)
  • Safe Command Generation: Focuses on standard, safe bash commands

๐Ÿ—๏ธ Architecture

Core Components

terminal-buddy/
├── src/terminal_buddy/
│   ├── main.py              # CLI interface and server logic
│   └── utils/
│       ├── llm_functions.py     # LLM integration with Ollama
│       ├── config.py            # Configuration management
│       ├── prompts.py           # System prompts and templates
│       ├── example_selection.py # Vector-based example retrieval
│       └── examples.json        # Command examples database
├── data/examples/
│   └── text_2_command_examples.json  # Training examples (not included in the repo; place your own here)
└── tests/                    # Test suite

Key Technologies

  • Ollama: Local LLM inference engine
  • LangChain: Vector embeddings and example selection
  • ChromaDB: Vector database for semantic search
  • Typer: Modern CLI framework
  • Pydantic: Configuration and data validation
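The example-selection idea behind these technologies can be sketched in miniature. The real pipeline embeds queries with nomic-embed-text and ranks them in ChromaDB via LangChain; the toy bag-of-words "embedding" and tiny example set below are illustrative stand-ins for the core retrieval step (cosine similarity between the query and stored examples):

```python
import math
from collections import Counter

# Toy stand-in for a real embedding model (the project uses nomic-embed-text):
# represent text as a bag-of-words count vector.
def embed(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

EXAMPLES = [
    {"user_query": "list all hidden files", "command": "ls -a"},
    {"user_query": "check disk space", "command": "df -h"},
    {"user_query": "show running processes", "command": "ps aux"},
]

def select_examples(query: str, k: int = 2) -> list[dict]:
    """Return the k stored examples most similar to the query."""
    scored = sorted(
        EXAMPLES,
        key=lambda ex: cosine(embed(query), embed(ex["user_query"])),
        reverse=True,
    )
    return scored[:k]
```

In the actual project, the embedded examples live in a ChromaDB collection and LangChain's similarity-based example selector performs this ranking before the examples are inserted into the LLM prompt.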

📋 Prerequisites

  • Python 3.12 or higher
  • Ollama installed and running
  • Required Ollama models:
    • qwen3:0.6b (for command generation)
    • nomic-embed-text (for embeddings)

๐Ÿ› ๏ธ Installation

Using Poetry (Recommended)

# Clone the repository
git clone <repository-url>
cd terminal-buddy

# Install dependencies
poetry install

# Install with development dependencies
poetry install --with dev

Using pip

# Clone the repository
git clone <repository-url>
cd terminal-buddy

# Install dependencies
pip install -r requirements.txt

# Install in development mode
pip install -e .

Setup Ollama Models

# Pull required models
ollama pull qwen3:0.6b
ollama pull nomic-embed-text

🚀 Usage

Command Line Interface

TBuddy provides a simple CLI for generating commands:

# Basic usage
tb "list all files in current directory"

# Using the full command
tb query "show me disk usage"

# Start background service
tb serve

Examples

# File operations
tb "create a new directory called projects"
# Output: mkdir projects

tb "list all hidden files"
# Output: ls -a

# System information
tb "check disk space"
# Output: df -h

tb "show running processes"
# Output: ps aux

# Text processing
tb "find all .txt files"
# Output: find . -name "*.txt"

tb "search for 'error' in log files"
# Output: grep -r "error" *.log

Daemon Mode

For persistent availability, run TBuddy as a background service:

# Start the daemon
tb serve

# In another terminal, send queries
tb "check memory usage"

The daemon runs on 127.0.0.1:65432 and automatically handles multiple concurrent requests.
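A client talking to that daemon might look like the sketch below. The wire protocol used here (send the raw query as UTF-8, half-close the socket, read one UTF-8 reply) is an assumption for illustration; the actual protocol is defined in main.py.

```python
import socket

HOST, PORT = "127.0.0.1", 65432  # address the TBuddy daemon listens on

def ask_daemon(query: str, host: str = HOST, port: int = PORT) -> str:
    """Send a natural-language query to the daemon and return its reply.

    Assumes a simple request/response protocol: one UTF-8 request,
    end-of-request signalled by a write-side shutdown, one UTF-8 reply.
    """
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(query.encode("utf-8"))
        sock.shutdown(socket.SHUT_WR)  # signal end of request
        chunks = []
        while chunk := sock.recv(4096):
            chunks.append(chunk)
    return b"".join(chunks).decode("utf-8")
```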

โš™๏ธ Configuration

Configuration is managed through the Config class in src/terminal_buddy/utils/config.py:

class Config(BaseModel):
    OLLAMA_MODEL_NAME: str = Field(default="qwen3:0.6b")
    OLLAMA_EMBEDDINGS_MODEL_NAME: str = Field(default="nomic-embed-text")
    EXAMPLES_JSON_PATH: str = Field(default="path/to/examples.json")
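Because Config is a plain Pydantic model, settings can be overridden per instance. The sketch below reproduces the class above and shows default and overridden values; the alternative model tag is only an illustration, not a project recommendation:

```python
from pydantic import BaseModel, Field

class Config(BaseModel):
    OLLAMA_MODEL_NAME: str = Field(default="qwen3:0.6b")
    OLLAMA_EMBEDDINGS_MODEL_NAME: str = Field(default="nomic-embed-text")
    EXAMPLES_JSON_PATH: str = Field(default="path/to/examples.json")

# Defaults apply when no overrides are given.
default_cfg = Config()

# Individual fields can be overridden, e.g. to experiment with a larger model.
custom_cfg = Config(OLLAMA_MODEL_NAME="qwen3:1.7b")
```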

🔧 Development

Project Structure

  • main.py: Entry point with CLI commands and server logic
  • utils/llm_functions.py: Core LLM integration using Ollama
  • utils/example_selection.py: Vector-based example retrieval using LangChain
  • utils/prompts.py: System prompts for command generation
  • data/examples/: Training data with text-to-command mappings

Adding New Examples

To improve command generation, add new examples to data/examples/text_2_command_examples.json:

{
    "user_query": "Your natural language description",
    "command": "corresponding bash command"
}
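Appending examples can also be scripted. The helper below is a hypothetical convenience (not part of TBuddy) that validates the two required keys before writing; it assumes the file holds a JSON list of such objects:

```python
import json
from pathlib import Path

def add_example(path: str, user_query: str, command: str) -> None:
    """Validate and append one text-to-command example to the JSON database.

    Assumes the file contains a JSON list of
    {"user_query": ..., "command": ...} objects.
    """
    if not user_query.strip() or not command.strip():
        raise ValueError("Both user_query and command must be non-empty")
    p = Path(path)
    examples = json.loads(p.read_text()) if p.exists() else []
    examples.append({"user_query": user_query, "command": command})
    p.write_text(json.dumps(examples, indent=4))
```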

Running Tests

# Run tests
poetry run pytest

# Run with coverage
poetry run pytest --cov=terminal_buddy

๐Ÿค Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

Development Guidelines

  • Follow PEP 8 style guidelines
  • Add tests for new features
  • Update documentation for API changes
  • Use type hints for all functions

📊 Performance

  • Response Time: ~1-3 seconds for command generation
  • Memory Usage: ~2-4GB RAM (depending on model size)
  • Accuracy: High for common, single-step terminal operations; review generated commands before running them
  • Safety: Focuses on standard, safe bash commands

🔒 Security

  • Local Processing: All LLM operations run locally via Ollama
  • No Data Transmission: No queries or responses are sent to external services
  • Safe Commands: System prompts emphasize safe, standard bash commands
  • Input Validation: All inputs are validated before processing

๐Ÿ“ License

This project is licensed under the GNU General Public License v3 - see the LICENSE file for details.

๐Ÿ™ Acknowledgments

  • Ollama for local LLM inference
  • LangChain for vector operations and example selection
  • Typer for the CLI framework
  • Pydantic for data validation

📞 Support

For questions, issues, or contributions:

  • Open an issue on GitHub
  • Check the documentation
  • Review existing examples in data/examples/

Made with ❤️ for the terminal community
