
Henchman-AI

Your AI Henchman for the Terminal - A Model-Agnostic AI Agent CLI


Henchman-AI is a powerful, terminal-based AI agent that supports multiple LLM providers (DeepSeek, OpenAI, Anthropic, Ollama, and more) through a unified interface. Inspired by gemini-cli, built for extensibility and production use.

✨ Features

  • 🤝 Multi-Agent Dev Team: Orchestrate a team of specialists (Architect, Coder, Reviewer, Tester, etc.) to solve complex engineering tasks.
  • 🔄 Model-Agnostic: Support any LLM provider through a unified abstraction layer
  • 🐍 Pythonic: Leverages Python's async ecosystem and rich libraries for optimal performance
  • 🔌 Extensible: Plugin system for tools, providers, and custom commands
  • 🚀 Production-Ready: Proper error handling, comprehensive testing, and semantic versioning
  • 🛠️ Tool Integration: Built-in support for file operations, web search, code execution, and more
  • ⚡ Fast & Efficient: Async-first design with intelligent caching and rate limiting
  • 🔒 Secure: Environment-based configuration and safe execution sandboxing
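The tool plugin system can be pictured as a name-to-callable registry that the agent dispatches into. The sketch below is illustrative only: `register_tool`, `TOOL_REGISTRY`, and the `word_count` tool are hypothetical names, not Henchman-AI's actual API.

```python
# Illustrative sketch of a tool-plugin registry (hypothetical names,
# not Henchman-AI's actual API).
from typing import Callable, Dict

TOOL_REGISTRY: Dict[str, Callable[..., str]] = {}

def register_tool(name: str):
    """Decorator that records a callable under a tool name."""
    def decorator(func: Callable[..., str]) -> Callable[..., str]:
        TOOL_REGISTRY[name] = func
        return func
    return decorator

@register_tool("word_count")
def word_count(text: str) -> str:
    """Example tool: count whitespace-separated words."""
    return str(len(text.split()))

# The agent would dispatch by name at runtime:
result = TOOL_REGISTRY["word_count"]("hello henchman world")  # "3"
```

A decorator-based registry like this is a common pattern for plugin systems because new tools register themselves at import time without the core needing to know about them.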

📦 Installation

From PyPI (Recommended)

pip install henchman-ai

From Source

git clone https://github.com/MGPowerlytics/henchman-ai.git
cd henchman-ai
pip install -e ".[dev]"

With uv (Fastest)

uv pip install henchman-ai

🚀 Quick Start

  1. Set your API key (choose your preferred provider):

    export DEEPSEEK_API_KEY="your-api-key-here"
    # or
    export OPENAI_API_KEY="your-api-key-here"
    # or
    export ANTHROPIC_API_KEY="your-api-key-here"
    
  2. Start the CLI:

    henchman
    
  3. Or run with a prompt directly:

    henchman --prompt "Explain this Python code" < example.py
    

📖 Usage Examples

Basic Commands

# Show version
henchman --version

# Show help
henchman --help

# Interactive mode (default)
henchman

# Headless mode with prompt
henchman -p "Summarize the key points from README.md"

# Specify a provider
henchman --provider openai -p "Write a Python function to calculate fibonacci"

# Use a specific model
henchman --model gpt-4-turbo -p "Analyze this code for security issues"

File Operations

# Read and analyze a file
henchman -p "Review this code for bugs" < script.py

# Process multiple files
cat *.py | henchman -p "Find common patterns in these files"

# Generate documentation
henchman -p "Create API documentation for this module" < module.py > docs.md

⚙️ Configuration

Henchman-AI uses hierarchical configuration (later settings override earlier ones):

  1. Default settings (built-in sensible defaults)
  2. User settings: ~/.henchman/settings.yaml
  3. Workspace settings: .henchman/settings.yaml (project-specific)
  4. Environment variables (highest priority)
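The precedence above behaves like a chain of dictionaries consulted from highest to lowest priority. A minimal sketch using Python's `collections.ChainMap` (keys and values here are illustrative, not the real defaults):

```python
# Sketch of the precedence described above: later layers in the list
# above override earlier ones. Keys/values are illustrative.
from collections import ChainMap

defaults  = {"provider": "deepseek", "temperature": 0.7, "streaming": True}
user      = {"temperature": 0.5}       # ~/.henchman/settings.yaml
workspace = {"provider": "openai"}     # .henchman/settings.yaml
env       = {"temperature": 0.2}       # e.g. HENCHMAN_TEMPERATURE

# ChainMap looks up keys left-to-right, so the highest-priority
# layer goes first.
settings = ChainMap(env, workspace, user, defaults)

print(settings["provider"])     # "openai"  (workspace overrides default)
print(settings["temperature"])  # 0.2       (environment variable wins)
print(settings["streaming"])    # True      (built-in default survives)
```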

Example settings.yaml

# Provider configuration
providers:
  default: deepseek  # or openai, anthropic, ollama
  deepseek:
    model: deepseek-chat
    base_url: "https://api.deepseek.com"
    temperature: 0.7
  openai:
    model: gpt-4-turbo-preview
    organization: "org-xxx"

# Tool settings
tools:
  auto_accept_read: true
  shell_timeout: 60
  web_search_max_results: 5

# UI settings
ui:
  theme: "monokai"
  show_tokens: true
  streaming: true

# System settings
system:
  cache_enabled: true
  cache_ttl: 3600
  max_tokens: 4096

Environment Variables

# Provider API keys
export DEEPSEEK_API_KEY="sk-xxx"
export OPENAI_API_KEY="sk-xxx"
export ANTHROPIC_API_KEY="sk-xxx"

# Configuration overrides
export HENCHMAN_DEFAULT_PROVIDER="openai"
export HENCHMAN_DEFAULT_MODEL="gpt-4"
export HENCHMAN_TEMPERATURE="0.5"
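A resolver for these overrides might map a settings key to its `HENCHMAN_`-prefixed variable and fall back to the configured value. This is a sketch of the assumed behavior, not the project's actual implementation; note that environment values arrive as strings and would still need type coercion.

```python
import os

# Hypothetical resolver: env var (if set) beats the config value.
def resolve(key: str, config_value, env_prefix: str = "HENCHMAN_"):
    """Return the HENCHMAN_<KEY> env value if present, else config_value."""
    return os.environ.get(env_prefix + key.upper(), config_value)

os.environ["HENCHMAN_TEMPERATURE"] = "0.5"

resolve("temperature", 0.7)
# -> "0.5" (env wins; note it is a string, not a float)

resolve("default_model", "deepseek-chat")
# -> "deepseek-chat" (not set in this process, so config value is used)
```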

🔌 Supported Providers

| Provider  | Models                         | Features                         |
|-----------|--------------------------------|----------------------------------|
| DeepSeek  | deepseek-chat, deepseek-coder  | Free tier, Code completion       |
| OpenAI    | gpt-4, gpt-3.5-turbo, etc.     | Function calling, JSON mode      |
| Anthropic | claude-3-opus, claude-3-sonnet | Long context, Constitutional AI  |
| Ollama    | llama2, mistral, codellama     | Local models, Custom models      |
| Custom    | Any OpenAI-compatible API      | Self-hosted, Local inference     |
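The Custom row means any OpenAI-compatible server can be targeted through configuration alone. A hypothetical entry, following the settings.yaml shape shown earlier (the key names are an assumption — verify them against the project docs before relying on them):

```yaml
# Hypothetical entry for a self-hosted OpenAI-compatible server.
providers:
  default: custom
  custom:
    model: my-local-model
    base_url: "http://localhost:8080/v1"
    temperature: 0.7
```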

🛠️ Development

Setup Development Environment

# Clone and install
git clone https://github.com/MGPowerlytics/henchman-ai.git
cd henchman-ai
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -e ".[dev]"

Running Tests

# Run all tests
pytest

# Run with coverage
pytest --cov=henchman --cov-report=html

# Run specific test categories
pytest tests/unit/ -v
pytest tests/integration/ -v

Code Quality

# Linting
ruff check src/ tests/
ruff format src/ tests/

# Type checking
mypy src/

# Security scanning
bandit -r src/

Building and Publishing

# Build package
hatch build

# Test build
hatch run test

# Publish to PyPI (requires credentials)
hatch publish

📚 Documentation

Online Documentation

For detailed documentation, see the docs directory in this repository.

Building Documentation Locally

You can build and view the documentation locally:

# Install documentation dependencies
pip install mkdocs mkdocs-material mkdocstrings[python]

# Build static HTML documentation
python scripts/build_docs.py

# Or serve documentation locally (live preview)
mkdocs serve

The documentation will be available at http://localhost:8000 when served locally.

🤝 Contributing

We welcome contributions! Please see CONTRIBUTING.md for details.

🐛 Reporting Issues

Found a bug or have a feature request? Please open an issue on GitHub.

📄 License

Henchman-AI is released under the MIT License. See the LICENSE file for details.

🙏 Acknowledgments

  • Inspired by gemini-cli
  • Built with Rich for beautiful terminal output
  • Uses Pydantic for data validation
  • Powered by the Python async ecosystem

Happy coding with your AI Henchman! 🦸‍♂️🤖

Download files

Download the file for your platform.

Source distribution: henchman_ai-0.2.12.tar.gz (29.5 MB)

Built distribution: henchman_ai-0.2.12-py3-none-any.whl (171.8 kB)

File details

Details for the file henchman_ai-0.2.12.tar.gz.

File metadata

  • Download URL: henchman_ai-0.2.12.tar.gz
  • Upload date:
  • Size: 29.5 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.7

File hashes

Hashes for henchman_ai-0.2.12.tar.gz
| Algorithm   | Hash digest                                                      |
|-------------|------------------------------------------------------------------|
| SHA256      | bde380910707b03225a602db3f2abe5c7745e7297e1b2e2b480fe0adc7d322e2 |
| MD5         | a887768b8fb30375549d2a0da121bbdc                                 |
| BLAKE2b-256 | ba2a991e435eae4cb382192f7fcd15c43bdbd562b0f83b7efa1989576ec111c1 |
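You can check a downloaded archive against the published SHA256 digest with Python's standard hashlib; the helper name and file path below are illustrative.

```python
# Verify a downloaded artifact against the published SHA256 digest
# (the digest below is the one listed for henchman_ai-0.2.12.tar.gz).
import hashlib

EXPECTED = "bde380910707b03225a602db3f2abe5c7745e7297e1b2e2b480fe0adc7d322e2"

def sha256_of_file(path: str) -> str:
    """Stream the file in chunks so large archives don't load into RAM."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Usage (path is illustrative):
# assert sha256_of_file("henchman_ai-0.2.12.tar.gz") == EXPECTED
```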


File details

Details for the file henchman_ai-0.2.12-py3-none-any.whl.

File metadata

  • Download URL: henchman_ai-0.2.12-py3-none-any.whl
  • Upload date:
  • Size: 171.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.7

File hashes

Hashes for henchman_ai-0.2.12-py3-none-any.whl
| Algorithm   | Hash digest                                                      |
|-------------|------------------------------------------------------------------|
| SHA256      | a58480811a91929cc8612aee6251cf3e73c7e180ccc0d0baadd3826028d68710 |
| MD5         | c830c7ed0fe11bddeb5c241db79551d4                                 |
| BLAKE2b-256 | 5ac341172dbb029d2892f9cf663f19672baa1cdba6a235b92fe0050c254d1796 |

