
A professional CLI tool for interacting with OpenAI and Anthropic LLMs with cost tracking


llmctl - Command Line LLM Interface

A powerful CLI tool to interact with various LLM providers (OpenAI, Anthropic/Claude) with interactive sessions, colored output, and persistent file attachments.

PyPI version Python 3.7+ License: MIT

✨ Features

  • 🎨 Colored terminal output - Beautiful, easy-to-read interface
  • 💬 Interactive sessions - Keep conversations going without restarting
  • 📎 File attachments - Add/remove files as context during conversations
  • 💾 Session persistence - Your conversations are saved automatically
  • 🔄 Multi-provider support - Switch between OpenAI and Claude seamlessly
  • 📜 Conversation history - Review past exchanges in your session
  • 💰 Real-time cost tracking - See exact costs after every API call
  • 📊 Session statistics - Track total spending per session
  • 🌍 Cross-platform - Works on Windows, Mac, and Linux

Installation

Via pip (Recommended)

pip install llmctl

From source

git clone https://github.com/sabbiramin113008/llmctl.git
cd llmctl
pip install -e .

Quick Start

  1. Initialize llmctl:

    llmctl init
    
  2. Set your API keys:

    export OPENAI_API_KEY="sk-your-key-here"
    export ANTHROPIC_API_KEY="sk-ant-your-key-here"
    
  3. Start chatting:

    llmctl use claude:sonnet-4
    llmctl interactive
    

Usage

🎮 Interactive Mode (Recommended)

Start an interactive session where you can have ongoing conversations:

llmctl interactive

Interactive Commands:

  • /help - Show available commands
  • /use <provider> - Switch LLM provider (e.g., /use gpt-4)
  • /attach <file> - Attach a file as context
  • /detach <file> - Remove an attached file
  • /files - List all attached files
  • /clear - Clear conversation history
  • /clearfiles - Remove all attached files
  • /history - Show conversation history
  • /stats - Show session statistics and total costs
  • /exit or /quit - Exit the session

Example Session:

llmctl interactive

โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
  ๐Ÿš€ llmctl - Interactive LLM Session
โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•

  ๐Ÿ“ก Provider: anthropic (claude-sonnet-4-20250514)
  ๐Ÿ’พ Session:  default

โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€
  ๐Ÿ“š Quick Commands:
     /help        - Show all commands
     /use <model>  - Switch LLM provider
     /attach <file> - Add file context
     /stats       - Show costs & usage
     /exit        - Exit session
โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€
  ๐Ÿ’ก Tip: Type naturally - no quotes needed!
โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•

โฏ explain quantum computing

๐Ÿค– anthropic (claude-sonnet-4-20250514):
Quantum computing harnesses quantum mechanical phenomena...

โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€
๐Ÿ’ฐ Cost Breakdown:
   Model: claude-sonnet-4-20250514
   Input tokens: 156 ($0.000468)
   Output tokens: 423 ($0.006345)
   Total tokens: 579
   Total cost: $0.006813
โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€

โฏ /stats

โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
  ๐Ÿ“Š Session Statistics
โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•

  Session Name:    default
  Exchanges:       1 conversations
  Total Tokens:    579
  Total Cost:      $0.006813
โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•

๐Ÿ“ One-Off Questions

Ask a single question without entering interactive mode:

llmctl ask "what is a Fibonacci number?"
llmctl ask "write a Python function to reverse a string"

🔄 Provider Management

Switch between different models:

# OpenAI models
llmctl use gpt-4
llmctl use gpt-4-turbo
llmctl use gpt-4o-mini

# Claude models
llmctl use claude:sonnet-4
llmctl use claude:sonnet-4.5
llmctl use claude:opus-4
llmctl use claude:haiku-4

💾 Session Management

Use named sessions to keep different conversations separate:

# Start a named session
llmctl interactive --session myproject

# Start another session
llmctl interactive --session work

Sessions are stored in ~/.cllm/sessions/ and persist across restarts.

📂 File Structure

~/.cllm/
├── config.json              # Current provider and session
└── sessions/
    ├── default.json         # Default session
    ├── myproject.json       # Named session
    └── work.json            # Another session
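The layout above can be inspected programmatically. A minimal sketch (not part of llmctl itself) that lists saved session names, assuming one `<name>.json` file per session as shown:

```python
from pathlib import Path

def list_sessions(root=Path.home() / ".cllm" / "sessions"):
    """Return saved session names, assuming one <name>.json per session."""
    if not root.is_dir():
        return []
    return sorted(p.stem for p in root.glob("*.json"))

print(list_sessions())  # e.g. ['default', 'myproject', 'work']
```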

🎨 Color Scheme

The interface uses the following color scheme:

  • Cyan - Borders and structure
  • Blue - User input prompts
  • Magenta - Section headers and AI labels
  • Yellow - Command names and highlights
  • Green - Success messages
  • Red - Error messages
  • Black/Default - Response text (high contrast)

Advanced Examples

Code Review Workflow

llmctl interactive --session codereview

โฏ /attach app.py
โฏ /attach utils.py
โฏ /files
๐Ÿ“Ž Attached files:
  โ€ข app.py
  โ€ข utils.py

โฏ review these files for security issues

[Assistant analyzes both files...]

โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€
๐Ÿ’ฐ Cost Breakdown:
   Model: claude-sonnet-4-20250514
   Input tokens: 2,847 ($0.008541)
   Output tokens: 1,234 ($0.018510)
   Total tokens: 4,081
   Total cost: $0.027051
โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€

โฏ /stats

โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
  ๐Ÿ“Š Session Statistics
โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•

  Session Name:    codereview
  Exchanges:       1 conversations
  Total Tokens:    4,081
  Total Cost:      $0.027051
  Attached Files:  2 files
โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•

Cost Comparison Between Models

llmctl interactive --session comparison

โฏ /use claude:haiku-4
โฏ explain neural networks in 100 words

โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€
๐Ÿ’ฐ Cost Breakdown:
   Model: claude-haiku-4-20250514
   Input tokens: 12 ($0.000010)
   Output tokens: 95 ($0.000380)
   Total tokens: 107
   Total cost: $0.000390
โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€

โฏ /clear
โฏ /use gpt-4
โฏ explain neural networks in 100 words

โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€
๐Ÿ’ฐ Cost Breakdown:
   Model: gpt-4
   Input tokens: 12 ($0.000360)
   Output tokens: 102 ($0.006120)
   Total tokens: 114
   Total cost: $0.006480
โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€

# Haiku is 16x cheaper! ๐ŸŽ‰
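The "16x" figure follows directly from the two sample totals; checking the arithmetic:

```python
gpt4_total = 0.006480   # gpt-4 total cost from the sample above
haiku_total = 0.000390  # haiku-4 total cost from the sample above

ratio = gpt4_total / haiku_total
print(f"gpt-4 is {ratio:.1f}x more expensive")  # ~16.6x
```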

Supported Providers

OpenAI

  • gpt-4 - Most capable, $30/$60 per 1M tokens
  • gpt-4-turbo - Fast and capable, $10/$30 per 1M tokens
  • gpt-4o - Optimized, $2.50/$10 per 1M tokens
  • gpt-4o-mini - Fast and cheap, $0.15/$0.60 per 1M tokens
  • gpt-3.5-turbo - Legacy, $0.50/$1.50 per 1M tokens

Anthropic (Claude)

  • sonnet-4 or claude-sonnet-4-20250514 - Balanced, $3/$15 per 1M tokens
  • sonnet-4.5 or claude-sonnet-4-5-20250929 - Latest Sonnet
  • opus-4 or claude-opus-4-20250514 - Most capable, $15/$75 per 1M tokens
  • haiku-4 or claude-haiku-4-20250514 - Fastest & cheapest, $0.80/$4 per 1M tokens
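Per-call cost is linear in token counts at these per-1M-token rates. A minimal sketch (not llmctl's internal code) that reproduces the Sonnet figures from the interactive example above:

```python
# (input_rate, output_rate) in USD per 1M tokens, from the tables above
PRICES = {
    "claude-sonnet-4-20250514": (3.00, 15.00),
    "claude-haiku-4-20250514": (0.80, 4.00),
    "gpt-4": (30.00, 60.00),
}

def call_cost(model, input_tokens, output_tokens):
    """USD cost of one API call: token counts times the per-1M rates."""
    in_rate, out_rate = PRICES[model]
    return (input_tokens * in_rate + output_tokens * out_rate) / 1e6

# 156 input + 423 output tokens, as in the sample session
print(f"${call_cost('claude-sonnet-4-20250514', 156, 423):.6f}")  # $0.006813
```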

Tips & Best Practices

  1. Start with cheaper models:

    llmctl use claude:haiku-4  # Perfect for simple tasks
    
  2. Attach files for context:

    /attach main.py
    /attach config.yaml
    /attach README.md
    
  3. Use named sessions for organization:

    llmctl interactive --session client-work
    llmctl interactive --session personal-projects
    
  4. Monitor costs regularly:

    /stats  # Check spending anytime
    
  5. Clear history when switching topics:

    /clear  # Start fresh conversation
    
  6. Model selection guide:

    • Simple Q&A, summaries: haiku-4 or gpt-4o-mini
    • Code review, analysis: sonnet-4 or gpt-4o
    • Complex reasoning, research: opus-4 or gpt-4
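If you script around llmctl, the selection guide above can be encoded as a small lookup. This is a hypothetical helper, not part of llmctl; the task labels are illustrative:

```python
# Hypothetical task-to-model map based on the guide above
MODEL_GUIDE = {
    "qa": "claude:haiku-4",            # simple Q&A, summaries
    "code-review": "claude:sonnet-4",  # code review, analysis
    "research": "claude:opus-4",       # complex reasoning, research
}

def pick_model(task, default="claude:sonnet-4"):
    """Return the suggested model for a task, falling back to a default."""
    return MODEL_GUIDE.get(task, default)

print(pick_model("qa"))  # claude:haiku-4
```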

Environment Variables

Set these in your shell profile (~/.bashrc, ~/.zshrc, etc.) for persistence:

# Add to ~/.bashrc or ~/.zshrc
export OPENAI_API_KEY="sk-your-key-here"
export ANTHROPIC_API_KEY="sk-ant-your-key-here"

Then reload:

source ~/.bashrc  # or source ~/.zshrc
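To confirm from Python that both keys are visible to child processes, a quick check (any standalone script will do):

```python
import os

# Report whether each required key is present in the environment
for key in ("OPENAI_API_KEY", "ANTHROPIC_API_KEY"):
    status = "set" if os.environ.get(key) else "missing"
    print(f"{key}: {status}")
```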

Troubleshooting

Colors not showing?

Colorama is installed automatically. If colors don't work:

pip install --upgrade colorama

API key errors?

Verify keys are set:

echo $OPENAI_API_KEY
echo $ANTHROPIC_API_KEY

Command not found?

Ensure Python scripts directory is in PATH:

# Add to ~/.bashrc or ~/.zshrc
export PATH="$HOME/.local/bin:$PATH"

Session not saving?

Check permissions:

ls -la ~/.cllm/
chmod 755 ~/.cllm

Development

Install in development mode:

git clone https://github.com/sabbiramin113008/llmctl.git
cd llmctl
pip install -e .

Run tests:

pytest tests/

Build package:

python -m build

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/AmazingFeature)
  3. Commit your changes (git commit -m 'Add some AmazingFeature')
  4. Push to the branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Author

SM Sabbir Amin - GitHub

Acknowledgments

  • OpenAI for the GPT API
  • Anthropic for the Claude API
  • The Python community for amazing tools

Made with ❤️ by developers, for developers
