
Henchman-AI

Your AI Henchman for the Terminal - A Model-Agnostic AI Agent CLI


Henchman-AI is a powerful, terminal-based AI agent that supports multiple LLM providers (DeepSeek, OpenAI, Anthropic, Ollama, and more) through a unified interface. Inspired by gemini-cli, built for extensibility and production use.

✨ Features

  • 🤝 Multi-Agent Dev Team: Orchestrate a team of specialists (Architect, Coder, Reviewer, Tester, etc.) to solve complex engineering tasks.
  • 🔄 Model-Agnostic: Support any LLM provider through a unified abstraction layer
  • 🐍 Pythonic: Leverages Python's async ecosystem and rich libraries for optimal performance
  • 🔌 Extensible: Plugin system for tools, providers, and custom commands
  • 🚀 Production-Ready: Proper error handling, comprehensive testing, and semantic versioning
  • 🛠️ Tool Integration: Built-in support for file operations, web search, code execution, and more
  • ⚡ Fast & Efficient: Async-first design with intelligent caching and rate limiting
  • 🔒 Secure: Environment-based configuration and safe execution sandboxing
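The "unified abstraction layer" behind the model-agnostic design can be pictured as a small provider interface. The names below (`Provider`, `generate`, `EchoProvider`) are illustrative assumptions for this sketch, not Henchman-AI's actual API:

```python
from abc import ABC, abstractmethod


class Provider(ABC):
    """Hypothetical sketch of a model-agnostic provider interface."""

    @abstractmethod
    def generate(self, prompt: str) -> str:
        """Send a prompt to the underlying LLM and return its reply."""


class EchoProvider(Provider):
    """Stand-in backend used here instead of a real API call."""

    def generate(self, prompt: str) -> str:
        return f"echo: {prompt}"


def run(provider: Provider, prompt: str) -> str:
    # The CLI depends only on the interface, so backends are swappable.
    return provider.generate(prompt)


print(run(EchoProvider(), "hello"))  # → echo: hello
```

A real DeepSeek or OpenAI backend would implement the same interface, which is what lets one `henchman --provider …` flag switch between them.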

📦 Installation

From PyPI (Recommended)

pip install henchman-ai

From Source

git clone https://github.com/MGPowerlytics/henchman-ai.git
cd henchman-ai
pip install -e ".[dev]"

With uv (Fastest)

uv pip install henchman-ai

🚀 Quick Start

  1. Set your API key (choose your preferred provider):

    export DEEPSEEK_API_KEY="your-api-key-here"
    # or
    export OPENAI_API_KEY="your-api-key-here"
    # or
    export ANTHROPIC_API_KEY="your-api-key-here"
    
  2. Start the CLI:

    henchman
    
  3. Or run with a prompt directly:

    henchman --prompt "Explain this Python code" < example.py
    

📖 Usage Examples

Basic Commands

# Show version
henchman --version

# Show help
henchman --help

# Interactive mode (default)
henchman

# Headless mode with prompt
henchman -p "Summarize the key points from README.md"

# Specify a provider
henchman --provider openai -p "Write a Python function to calculate fibonacci"

# Use a specific model
henchman --model gpt-4-turbo -p "Analyze this code for security issues"

File Operations

# Read and analyze a file
henchman -p "Review this code for bugs" < script.py

# Process multiple files
cat *.py | henchman -p "Find common patterns in these files"

# Generate documentation
henchman -p "Create API documentation for this module" < module.py > docs.md

⚙️ Configuration

Henchman-AI uses hierarchical configuration (later settings override earlier ones):

  1. Default settings (built-in sensible defaults)
  2. User settings: ~/.henchman/settings.yaml
  3. Workspace settings: .henchman/settings.yaml (project-specific)
  4. Environment variables (highest priority)
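The precedence above amounts to a layered merge where later layers win. A minimal sketch of that logic (the layer contents here are made up for illustration and the settings are flattened to a single dict):

```python
def merge_settings(*layers: dict) -> dict:
    """Merge setting layers left to right; later layers override earlier ones."""
    merged: dict = {}
    for layer in layers:
        merged.update(layer)
    return merged


defaults = {"provider": "deepseek", "temperature": 0.7}   # built-in defaults
user = {"temperature": 0.5}                               # ~/.henchman/settings.yaml
workspace = {"provider": "openai"}                        # .henchman/settings.yaml
env = {"temperature": 0.2}                                # environment variables

settings = merge_settings(defaults, user, workspace, env)
print(settings)  # → {'provider': 'openai', 'temperature': 0.2}
```

Note how the workspace wins on `provider` and the environment wins on `temperature`, matching the priority order listed above.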

Example settings.yaml

# Provider configuration
providers:
  default: deepseek  # or openai, anthropic, ollama
  deepseek:
    model: deepseek-chat
    base_url: "https://api.deepseek.com"
    temperature: 0.7
  openai:
    model: gpt-4-turbo-preview
    organization: "org-xxx"

# Tool settings
tools:
  auto_accept_read: true
  shell_timeout: 60
  web_search_max_results: 5

# UI settings
ui:
  theme: "monokai"
  show_tokens: true
  streaming: true

# System settings
system:
  cache_enabled: true
  cache_ttl: 3600
  max_tokens: 4096
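One way to read `cache_enabled` and `cache_ttl`: cached entries are served for `cache_ttl` seconds and then expire. A simplified, self-contained sketch of that behavior (not Henchman-AI's actual cache implementation):

```python
import time


class TTLCache:
    """Tiny time-to-live cache; entries expire after `ttl` seconds."""

    def __init__(self, ttl: float, clock=time.monotonic):
        self.ttl = ttl
        self.clock = clock  # injectable clock makes expiry testable
        self._store: dict = {}

    def set(self, key, value):
        self._store[key] = (value, self.clock())

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, stored_at = entry
        if self.clock() - stored_at > self.ttl:
            del self._store[key]  # expired: drop the entry and miss
            return default
        return value


# Fake clock makes the 3600-second TTL deterministic to demonstrate.
now = [0.0]
cache = TTLCache(ttl=3600, clock=lambda: now[0])
cache.set("prompt", "cached reply")
assert cache.get("prompt") == "cached reply"  # fresh: hit
now[0] = 3601.0
assert cache.get("prompt") is None            # past TTL: miss
```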

Environment Variables

# Provider API keys
export DEEPSEEK_API_KEY="sk-xxx"
export OPENAI_API_KEY="sk-xxx"
export ANTHROPIC_API_KEY="sk-xxx"

# Configuration overrides
export HENCHMAN_DEFAULT_PROVIDER="openai"
export HENCHMAN_DEFAULT_MODEL="gpt-4"
export HENCHMAN_TEMPERATURE="0.5"

🔌 Supported Providers

| Provider  | Models                           | Features                         |
|-----------|----------------------------------|----------------------------------|
| DeepSeek  | deepseek-chat, deepseek-coder    | Free tier, code completion       |
| OpenAI    | gpt-4, gpt-3.5-turbo, etc.       | Function calling, JSON mode      |
| Anthropic | claude-3-opus, claude-3-sonnet   | Long context, Constitutional AI  |
| Ollama    | llama2, mistral, codellama       | Local models, custom models      |
| Custom    | Any OpenAI-compatible API        | Self-hosted, local inference     |
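For the Custom row, configuration would presumably point the provider at your own OpenAI-compatible endpoint. The snippet below mirrors the earlier settings.yaml example; the provider name, model name, and URL are placeholders, not documented values:

```yaml
# Hypothetical settings.yaml fragment for a self-hosted endpoint
providers:
  default: custom
  custom:
    model: my-local-model                 # placeholder model name
    base_url: "http://localhost:8080/v1"  # any OpenAI-compatible server
    temperature: 0.7
```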

🛠️ Development

Setup Development Environment

# Clone and install
git clone https://github.com/MGPowerlytics/henchman-ai.git
cd henchman-ai
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -e ".[dev]"

Running Tests

# Run all tests
pytest

# Run with coverage
pytest --cov=henchman --cov-report=html

# Run specific test categories
pytest tests/unit/ -v
pytest tests/integration/ -v

Code Quality

# Linting
ruff check src/ tests/
ruff format src/ tests/

# Type checking
mypy src/

# Security scanning
bandit -r src/

Building and Publishing

# Build package
hatch build

# Test build
hatch run test

# Publish to PyPI (requires credentials)
hatch publish

📚 Documentation

Online Documentation

For detailed documentation, see the docs directory in this repository.

Building Documentation Locally

You can build and view the documentation locally:

# Install documentation dependencies
pip install mkdocs mkdocs-material mkdocstrings[python]

# Build static HTML documentation
python scripts/build_docs.py

# Or serve documentation locally (live preview)
mkdocs serve

The documentation will be available at http://localhost:8000 when served locally.

🤝 Contributing

We welcome contributions! Please see CONTRIBUTING.md for details.

🐛 Reporting Issues

Found a bug or have a feature request? Please open an issue on GitHub.

📄 License

Henchman-AI is released under the MIT License. See the LICENSE file for details.

🙏 Acknowledgments

  • Inspired by gemini-cli
  • Built with Rich for beautiful terminal output
  • Uses Pydantic for data validation
  • Powered by the Python async ecosystem

Happy coding with your AI Henchman! 🦸‍♂️🤖
