
Henchman-AI

Your AI Henchman for the Terminal - A Model-Agnostic AI Agent CLI


Henchman-AI is a powerful, terminal-based AI agent that supports multiple LLM providers (DeepSeek, OpenAI-compatible APIs, Anthropic, Ollama, OpenRouter, and more) through a unified interface. Inspired by gemini-cli, built for extensibility and production use.

Project Status: ✅ Complete - All 13 development phases are implemented, including Multi-Agent Dev Team orchestration, MCP integration, and a comprehensive tool system.

✨ Features

  • 🤝 Multi-Agent Dev Team: Orchestrate a team of specialists (Tech Lead, Planner, Explorer, Engineer, Data Engineer) to solve complex engineering tasks.
  • 🔄 Model-Agnostic: Support any LLM provider through a unified abstraction layer
  • 🐍 Pythonic: Leverages Python's async ecosystem and rich libraries for optimal performance
  • 🔌 Extensible: Plugin system for tools, providers, and custom commands
  • 🚀 Production-Ready: Proper error handling, comprehensive testing with 100% coverage, and semantic versioning
  • 🛠️ Built-in Tools: 34 tools for file operations, shell commands, web fetching, code analysis, testing, and multi-agent coordination
  • 🔗 MCP Integration: Connect to external tool servers via Model Context Protocol
  • ⚡ Fast & Efficient: Async-first design with intelligent caching and rate limiting
  • 🔒 Secure: Environment-based configuration and safe execution sandboxing
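
To make the "unified abstraction layer" concrete, here is a minimal sketch of what a model-agnostic provider contract can look like. The `ChatProvider` protocol, `EchoProvider` backend, and method names are hypothetical illustrations, not henchman-ai's actual API:

```python
import asyncio
from typing import AsyncIterator, Protocol


class ChatProvider(Protocol):
    """Hypothetical unified contract each backend (DeepSeek, Anthropic,
    Ollama, ...) would satisfy. Illustrative only."""

    def complete(self, messages: list[dict]) -> AsyncIterator[str]: ...


class EchoProvider:
    """Toy backend, used here only to show the shape of the contract."""

    async def complete(self, messages: list[dict]) -> AsyncIterator[str]:
        # Stream the last user message back word by word.
        for word in messages[-1]["content"].split():
            yield word


async def run(provider: ChatProvider, prompt: str) -> str:
    # The CLI only ever talks to the protocol, never a concrete backend.
    chunks = [c async for c in provider.complete([{"role": "user", "content": prompt}])]
    return " ".join(chunks)


result = asyncio.run(run(EchoProvider(), "hello from the terminal"))
```

Because every backend streams chunks through the same async interface, swapping providers is a configuration change rather than a code change.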

📦 Installation

Prerequisites

  • Python 3.10 or higher
  • API Key: For your chosen LLM provider (DeepSeek, OpenAI, Anthropic, etc.)
  • Git (optional): For installing from source
  • uv (optional): For faster dependency management

From PyPI (Recommended)

pip install henchman-ai

From Source

git clone https://github.com/MGPowerlytics/henchman-ai.git
cd henchman-ai
pip install -e ".[dev]"

With uv (Fastest)

uv pip install henchman-ai

🚀 Quick Start

  1. Set your API key (choose your preferred provider):

    # DeepSeek (default provider)
    export DEEPSEEK_API_KEY="your-api-key-here"
    
    # OpenAI or OpenAI-compatible services
    export OPENAI_API_KEY="your-api-key-here"
    
    # Anthropic Claude
    export ANTHROPIC_API_KEY="your-api-key-here"
    
    # OpenRouter
    export OPENROUTER_API_KEY="your-api-key-here"
    
    # For other providers, see the Providers documentation
    
  2. Start the CLI:

    henchman
    
  3. Or run with a prompt directly:

    henchman --prompt "Explain this Python code" < example.py
    

๐Ÿ—๏ธ Architecture

Henchman-AI features a modular, component-based architecture designed for maintainability and extensibility. The core interactive REPL (Read-Eval-Print Loop) has been refactored into specialized components:

REPL Component Architecture

┌─────────────────────────────────────────────────────────────┐
│                     REPL (Orchestrator)                     │
│  ┌──────────┐  ┌───────────┐  ┌─────────────┐  ┌─────────┐  │
│  │  Input   │  │  Output   │  │   Command   │  │  Tool   │  │
│  │ Handler  │◄─┤  Handler  │◄─┤  Processor  │◄─┤Executor │  │
│  └──────────┘  └───────────┘  └─────────────┘  └─────────┘  │
│       │              │               │              │       │
│       ▼              ▼               ▼              ▼       │
│  ┌───────────────────────────────────────────────────────┐  │
│  │                Multi-Agent Orchestrator               │  │
│  └───────────────────────────────────────────────────────┘  │
└─────────────────────────────────────────────────────────────┘

Component Responsibilities

  1. REPL (Orchestrator): Main coordination class

    • Initializes and connects all components
    • Manages the main interaction loop
    • Delegates work to specialized components
    • Maintains backward compatibility
  2. InputHandler: User input processing

    • Manages prompt sessions with history
    • Handles @file expansion and shell command detection
    • Processes keyboard interrupts and EOF
    • Validates and sanitizes user input
  3. OutputHandler: Console output and status display

    • Manages rich console output and formatting
    • Displays status bars and tool information
    • Shows welcome/goodbye messages
    • Handles event streaming and turn status
  4. CommandProcessor: Slash command execution

    • Processes /quit, /clear, /help, and other commands
    • Manages command registry and argument parsing
    • Delegates to specialized command handlers
    • Provides command completion and validation
  5. ToolExecutor: Tool execution and agent coordination

    • Executes tool calls from agents
    • Manages tool confirmation requests
    • Processes agent event streams
    • Handles tool iteration limits and cancellation
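
The delegation pattern described above can be sketched roughly as follows. All class and method names here are illustrative, not henchman-ai's real interfaces, and the ToolExecutor is elided:

```python
# Illustrative sketch of the orchestrator pattern: the REPL wires
# components together and delegates, holding no business logic itself.

class InputHandler:
    def read(self, raw: str) -> str:
        # Sanitize user input before anything else sees it.
        return raw.strip()


class CommandProcessor:
    def process(self, line: str) -> str:
        # Slash commands are dispatched; everything else becomes a prompt.
        if line.startswith("/"):
            return f"command:{line[1:]}"
        return f"prompt:{line}"


class OutputHandler:
    def render(self, result: str) -> str:
        # Format the result for the console (rich formatting elided).
        return f"[out] {result}"


class REPL:
    """Orchestrator: initializes components and delegates each step."""

    def __init__(self) -> None:
        self.input = InputHandler()
        self.commands = CommandProcessor()
        self.output = OutputHandler()

    def handle(self, raw: str) -> str:
        line = self.input.read(raw)
        result = self.commands.process(line)
        return self.output.render(result)
```

Because each component is a plain object with one job, each can be unit-tested in isolation, which is what makes full test coverage of the core components tractable.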

Benefits of Component Architecture

  • Single Responsibility: Each component has a clear, focused purpose
  • Testability: Components can be tested independently (100% test coverage for core components)
  • Maintainability: Smaller, focused classes are easier to understand and modify
  • Extensibility: New components can be added without modifying the REPL
  • Performance: Business logic moved out of REPL, leaving only orchestration

📖 Usage Examples

Basic Commands

# Show version
henchman --version

# Show help
henchman --help

# Interactive mode (default)
henchman

# Headless mode with prompt
henchman -p "Summarize the key points from README.md"

# Specify a provider
henchman --provider openai_compat -p "Write a Python function to calculate fibonacci"

# Use a specific model
henchman --model gpt-4-turbo -p "Analyze this code for security issues"

File Operations

# Read and analyze a file
henchman -p "Review this code for bugs" < script.py

# Process multiple files
cat *.py | henchman -p "Find common patterns in these files"

# Generate documentation
henchman -p "Create API documentation for this module" < module.py > docs.md

โš™๏ธ Configuration

Henchman-AI uses hierarchical configuration (later settings override earlier ones):

  1. Default settings (built-in sensible defaults)
  2. User settings: ~/.henchman/settings.yaml
  3. Workspace settings: .henchman/settings.yaml (project-specific)
  4. Environment variables (highest priority)
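
A minimal sketch of how such layered merging might work. The merge function below is illustrative; the actual logic in henchman-ai may differ, for example in how deeply nested keys are combined:

```python
# Illustrative sketch of the four-layer hierarchy above.

def merge_settings(*layers: dict) -> dict:
    """Merge layers left to right; later layers override earlier ones."""
    merged: dict = {}
    for layer in layers:
        for key, value in layer.items():
            if isinstance(value, dict) and isinstance(merged.get(key), dict):
                merged[key] = {**merged[key], **value}  # one-level-deep merge
            else:
                merged[key] = value
    return merged


defaults  = {"providers": {"default": "deepseek"}, "ui": {"streaming": True}}
user      = {"ui": {"theme": "monokai"}}                 # ~/.henchman/settings.yaml
workspace = {"providers": {"default": "openai_compat"}}  # .henchman/settings.yaml

settings = merge_settings(defaults, user, workspace)
# Workspace wins on the default provider; user theme and built-in
# streaming default both survive the merge.
```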

Example settings.yaml

# Provider configuration
providers:
  default: deepseek  # or openai_compat, anthropic, ollama, openrouter
  deepseek:
    model: deepseek-chat
    base_url: "https://api.deepseek.com"
    temperature: 0.7
  openai_compat:
    model: gpt-4-turbo-preview
    organization: "org-xxx"

# Tool settings
tools:
  auto_accept_read: true
  shell_timeout: 60
  web_search_max_results: 5

# UI settings
ui:
  theme: "monokai"
  show_tokens: true
  streaming: true

# System settings
system:
  cache_enabled: true
  cache_ttl: 3600
  max_tokens: 4096

Environment Variables

# Provider API keys
export DEEPSEEK_API_KEY="sk-xxx"
export OPENAI_API_KEY="sk-xxx"
export ANTHROPIC_API_KEY="sk-xxx"
export OPENROUTER_API_KEY="sk-xxx"
export TOGETHER_API_KEY="sk-xxx"
export GROQ_API_KEY="sk-xxx"
export FIREWORKS_API_KEY="sk-xxx"

# Configuration overrides
export HENCHMAN_DEFAULT_PROVIDER="openai_compat"
export HENCHMAN_DEFAULT_MODEL="gpt-4"
export HENCHMAN_TEMPERATURE="0.5"

🔌 Supported Providers

Provider           Status  Config name     Notes
DeepSeek           ✅      deepseek        Default provider; OpenAI-compatible API
Anthropic          ✅      anthropic       Claude models (native SDK)
Ollama             ✅      ollama          Local models; OpenAI-compatible API
OpenRouter         ✅      openrouter      Access to hundreds of models; OpenAI-compatible API
OpenAI-Compatible  ✅      openai_compat   Generic provider for ANY OpenAI-compatible API (OpenAI, Together, Groq, Fireworks, etc.)
Together AI        ✅      together        Alias for openai_compat with the Together base URL
Groq               ✅      groq            Alias for openai_compat with the Groq base URL
Fireworks AI       ✅      fireworks       Alias for openai_compat with the Fireworks base URL

Note: The openai_compat provider is a generic provider for any OpenAI-compatible API. Specific providers like together, groq, and fireworks are aliases for openai_compat that require you to configure the appropriate base URL in your settings.

Configuration Names: Use these names in your settings.yaml file (e.g., providers.default: deepseek) or with the --provider CLI flag.

See Providers for complete details and configuration.

๐Ÿ› ๏ธ Development

Setup Development Environment

# Clone and install
git clone https://github.com/MGPowerlytics/henchman-ai.git
cd henchman-ai
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -e ".[dev]"

For AI Agent Development: Henchman-AI includes comprehensive copilot instructions for AI agents working on the codebase. See .github/copilot-instructions.md for architecture overview, development workflows, performance optimization strategies, and agent effectiveness guidelines.

Running Tests

# Run all tests
pytest

# Run with coverage (100% required)
pytest --cov=henchman --cov-report=html --cov-fail-under=100

# Run specific test categories
pytest tests/unit/ -v
pytest tests/integration/ -v

Code Quality

# Linting
ruff check src/ tests/
ruff format src/ tests/

# Type checking
mypy src/

# Security scanning
bandit -r src/

Building and Publishing

# Build package
hatch build

# Test build
hatch run test

# Publish to PyPI (requires credentials)
hatch publish

📚 Documentation

Online Documentation

For detailed documentation, see the docs directory in this repository.

Building Documentation Locally

You can build and view the documentation locally:

# Install documentation dependencies
pip install mkdocs mkdocs-material mkdocstrings[python]

# Build static HTML documentation
python scripts/build_docs.py

# Or serve documentation locally (live preview)
mkdocs serve

The documentation will be available at http://localhost:8000 when served locally.

๐Ÿค Contributing

We welcome contributions! Please see CONTRIBUTING.md for details.

๐Ÿ› Reporting Issues

Found a bug or have a feature request? Please open an issue on GitHub.

📄 License

Henchman-AI is released under the MIT License. See the LICENSE file for details.

๐Ÿ™ Acknowledgments

  • Inspired by gemini-cli
  • Built with Rich for beautiful terminal output
  • Uses Pydantic for data validation
  • Powered by the Python async ecosystem

Happy coding with your AI Henchman! 🦸‍♂️🤖



Download files


Source Distribution

henchman_ai-0.3.13.tar.gz (16.2 MB)

Built Distribution

henchman_ai-0.3.13-py3-none-any.whl (624.6 kB)

File details

Details for the file henchman_ai-0.3.13.tar.gz.

File metadata

  • Download URL: henchman_ai-0.3.13.tar.gz
  • Size: 16.2 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.7

File hashes

Hashes for henchman_ai-0.3.13.tar.gz:

Algorithm    Hash digest
SHA256       6a9edf26169b9297fb639376df92baffd897f298f822dc31bd4499766d6dc522
MD5          963525009c22c2d9c59012059145db8f
BLAKE2b-256  7c0636379ba0c9398e8a456344d3181b10034e39ed9f198a94753929eb00afe0


File details

Details for the file henchman_ai-0.3.13-py3-none-any.whl.

File metadata

  • Download URL: henchman_ai-0.3.13-py3-none-any.whl
  • Size: 624.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.7

File hashes

Hashes for henchman_ai-0.3.13-py3-none-any.whl:

Algorithm    Hash digest
SHA256       b3c9810c048bf85befdd4b6b5fc0f292227c72c52c6f976ac1ca865da0340be8
MD5          ba467c265264ee63fbe2250ca003415b
BLAKE2b-256  c41e67c889e34953df150e03fbb54dff162a9e8097385ee51ec4c6c3f3a96b82

