
Agntrick: An agentic framework for building AI-powered applications

Project description

🎩 Agntrick

Build AI agents that actually do things.



Combine local tools and MCP servers in a single, elegant runtime. Write agents in 5 lines of code. Run them anywhere.


💡 Why Agntrick?

Instead of spending days wiring together LLMs, tools, and execution environments, Agntrick gives you a production-ready setup instantly.

  • Write Less, Do More: Create a fully functional agent with just 5 lines of Python using the zero-config @AgentRegistry.register decorator.
  • Context is King (MCP): Native integration with Model Context Protocol (MCP) servers to give your agents live data (Web search, APIs, internal databases).
  • Hardcore Local Tools: Built-in blazing fast tools (ripgrep, fd, AST parsing) so your agents can explore and understand local codebases out-of-the-box.
  • Stateful & Resilient: Powered by LangGraph to support memory, cyclic reasoning, and human-in-the-loop workflows.
  • Docker-First Isolation: Every agent runs in isolated containers—no more "it works on my machine" when sharing with your team.

📦 Installation

From PyPI

pip install agntrick

# Or with development dependencies
pip install "agntrick[dev]"

From Source

git clone https://github.com/jeancsil/agntrick.git
cd agntrick
make install

🚀 Quick Start

1. Add your Brain (API Key)

You need an LLM API key to breathe life into your agents. Agntrick supports 10 LLM providers via LangChain!

# Copy the template
cp .env.example .env

# Edit .env and paste your API key
# Choose one of the following providers:
# OPENAI_API_KEY=sk-your-key-here
# ANTHROPIC_API_KEY=sk-ant-your-key-here
# GOOGLE_API_KEY=your-google-key
# GROQ_API_KEY=gsk-your-key-here
# MISTRAL_API_KEY=your-mistral-key-here
# COHERE_API_KEY=your-cohere-key-here

# For Ollama (local), no API key needed:
# OLLAMA_BASE_URL=http://localhost:11434

2. Run Your First Agent

# List all available agents
agntrick list

# Run an agent with input
agntrick developer -i "Explain this codebase"

# Or try the learning agent with web search
agntrick learning -i "Explain quantum computing in simple terms"

🔑 Supported Environment Variables

Only one provider's API key is required. The framework auto-detects which provider to use based on available credentials.

# Anthropic (Recommended)
ANTHROPIC_API_KEY=sk-ant-your-key-here

# OpenAI
OPENAI_API_KEY=sk-your-key-here

# Google GenAI / Vertex
GOOGLE_API_KEY=your-google-key
GOOGLE_VERTEX_PROJECT_ID=your-project-id

# Mistral AI
MISTRAL_API_KEY=your-mistral-key-here

# Cohere
COHERE_API_KEY=your-cohere-key-here

# Azure OpenAI
AZURE_OPENAI_API_KEY=your-azure-key
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com

# AWS Bedrock
AWS_PROFILE=your-profile

# Ollama (Local, no API key needed)
OLLAMA_BASE_URL=http://localhost:11434

# Hugging Face
HUGGINGFACEHUB_API_TOKEN=your-hf-token
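
The auto-detection described above can be pictured as a simple precedence check over environment variables. The following is a hypothetical sketch for illustration only, not Agntrick's actual implementation (the real preference order may differ):

```python
import os

# Hypothetical detection order: the first provider whose credential
# is present in the environment wins.
PROVIDER_ENV_KEYS = [
    ("anthropic", "ANTHROPIC_API_KEY"),
    ("openai", "OPENAI_API_KEY"),
    ("google", "GOOGLE_API_KEY"),
    ("mistral", "MISTRAL_API_KEY"),
    ("cohere", "COHERE_API_KEY"),
    ("azure", "AZURE_OPENAI_API_KEY"),
    ("ollama", "OLLAMA_BASE_URL"),
]


def detect_provider(env=os.environ) -> str:
    """Return the first provider with a configured credential."""
    for provider, key in PROVIDER_ENV_KEYS:
        if env.get(key):
            return provider
    raise RuntimeError("No LLM provider credentials found")
```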

📖 See docs/llm-providers.md for detailed environment variable configurations and provider comparison.


🧰 Available Out of the Box

🤖 Bundled Agents

Agntrick includes several pre-built agents for common use cases:

| Agent | Purpose | MCP Servers |
|---|---|---|
| `developer` | Code Master: read, search & edit code | fetch |
| `github-pr-reviewer` | PR Reviewer: reviews diffs, posts inline comments & summaries | - |
| `learning` | Tutor: step-by-step tutorials and explanations | fetch, web-forager |
| `news` | News Anchor: aggregates top stories | fetch |
| `youtube` | Video Analyst: extracts insights from YouTube videos | fetch |

📖 See docs/agents.md for detailed information about each agent.


📦 Local Tools

Fast, zero-dependency tools for working with local codebases:

| Tool | Capability |
|---|---|
| `find_files` | Fast file-name search via fd |
| `discover_structure` | Directory tree mapping |
| `get_file_outline` | AST signature parsing |
| `read_file_fragment` | Precise, line-ranged file reading |
| `code_search` | Fast content search via ripgrep |
| `edit_file` | Safe file editing |
| `youtube_transcript` | Extract transcripts from YouTube videos |

📖 See docs/tools.md for detailed documentation of each tool.
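
To give a feel for what these tools do, here is a hypothetical sketch of how a tool like `read_file_fragment` could be implemented; it is an illustration, not Agntrick's actual source:

```python
def read_file_fragment(filepath: str, start_line: int, end_line: int) -> str:
    """Return lines start_line..end_line (1-indexed, inclusive) of a file."""
    with open(filepath, encoding="utf-8") as f:
        lines = f.readlines()
    # Clamp the range so out-of-bounds requests return what exists
    # instead of raising.
    return "".join(lines[max(start_line - 1, 0):end_line])
```

Precise fragment reading like this lets an agent pull only the lines it needs into context instead of entire files.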


🌐 MCP Servers

Model Context Protocol servers for extending agent capabilities:

| Server | Purpose |
|---|---|
| `fetch` | Extract clean text from URLs |
| `web-forager` | Web search and content fetching |
| `kiwi-com-flight-search` | Real-time flight search |

📖 See docs/mcp-servers.md for details on each server and how to add custom MCP servers.


🧠 LLM Providers

Agntrick supports 10 LLM providers out of the box, covering the most widely used cloud and local options:

| Provider | Type | Use Case |
|---|---|---|
| Anthropic | Cloud | State-of-the-art reasoning (Claude) |
| OpenAI | Cloud | GPT-4, GPT-4.1, o1 series |
| Azure OpenAI | Cloud | Enterprise OpenAI deployments |
| Google GenAI | Cloud | Gemini models via API |
| Google Vertex AI | Cloud | Gemini models via GCP |
| Mistral AI | Cloud | European privacy-focused models |
| Cohere | Cloud | Enterprise RAG and Command models |
| AWS Bedrock | Cloud | Anthropic, Titan, Meta via AWS |
| Ollama | Local | Run LLMs locally (zero API cost) |
| Hugging Face | Cloud | Open models from Hugging Face Hub |

📖 See docs/llm-providers.md for detailed setup instructions.


🛠️ Build Your Own Agent

The 5-Line Superhero 🦸‍♂️

from agntrick import AgentBase, AgentRegistry

@AgentRegistry.register("my-agent", mcp_servers=["fetch"])
class MyAgent(AgentBase):
    @property
    def system_prompt(self) -> str:
        return "You are my custom agent with the power to fetch websites."

Boom. Run it instantly:

agntrick my-agent -i "Summarize https://example.com"

Advanced: Custom Local Tools 🔧

Want to add your own Python logic? Easy.

from langchain_core.tools import StructuredTool
from agntrick import AgentBase, AgentRegistry

@AgentRegistry.register("data-processor")
class DataProcessorAgent(AgentBase):
    @property
    def system_prompt(self) -> str:
        return "You process data files like a boss."

    def local_tools(self) -> list:
        return [
            StructuredTool.from_function(
                func=self.process_csv,
                name="process_csv",
                description="Process a CSV file path",
            )
        ]

    def process_csv(self, filepath: str) -> str:
        import csv

        # Magic happens here ✨
        with open(filepath, newline="") as f:
            row_count = len(list(csv.reader(f)))
        return f"Successfully processed {row_count} rows from {filepath}!"

⚙️ Configuration

Agntrick can be configured via a .agntrick.yaml file in your project root or home directory:

# .agntrick.yaml
llm:
  provider: anthropic  # or openai, google, ollama, etc.
  model: claude-sonnet-4-6  # optional model override
  temperature: 0.7

mcp:
  servers:
    - fetch
    - web-forager

logging:
  level: INFO
  file: logs/agent.log

💻 CLI Reference

Command your agents directly from the terminal.

# 📋 List all registered agents
agntrick list

# 🕵️ Get detailed info about what an agent can do
agntrick info developer

# 🚀 Run an agent with input
agntrick developer -i "Analyze the architecture of this project"

# ⏱️ Run with an execution timeout (seconds)
agntrick developer -i "Refactor this module" -t 120

# 📝 Run with debug-level verbosity
agntrick developer -i "Hello" -v

# 📜 View logs
tail -f logs/agent.log

🏗️ Architecture

Under the hood, Agntrick bridges the gap between user intent and execution:

flowchart TB
    subgraph User [👤 User Space]
        Input[User Input]
    end

    subgraph CLI [💻 CLI - agntrick]
        Typer[Typer Interface]
    end

    subgraph Registry [📋 Registry]
        AR[AgentRegistry]
        AD[Auto-discovery]
    end

    subgraph Agents [🤖 Agents]
        Dev[developer agent]
        Learning[learning agent]
        News[news agent]
    end

    subgraph Core [🧠 Core Engine]
        AB[AgentBase]
        LG[LangGraph Runtime]
        CP[(Checkpointing)]
    end

    subgraph Tools [🧰 Tools & Skills]
        LT[Local Tools]
        MCP[MCP Tools]
    end

    subgraph External [🌍 External World]
        LLM[LLM API]
        MCPS[MCP Servers]
    end

    Input --> Typer
    Typer --> AR
    AR --> AD
    AR -->|Routes to| Dev & Learning & News

    Dev & Learning & News -->|Inherits from| AB

    AB --> LG
    LG <--> CP
    AB -->|Uses| LT
    AB -->|Uses| MCP

    LT -->|Reasoning| LLM
    MCP -->|Queries| MCPS
    MCPS -->|Provides Data| LLM

    LLM --> Output[Final Response]

🧑‍💻 Local Development

System Requirements & Setup

Requirements:

  • Python 3.12+
  • uv package manager
  • ripgrep, fd, fzf (for local tools)

# Install dependencies (blazingly fast with uv ⚡)
make install

# Run the test suite
make test

# Run agents directly
agntrick developer -i "Hello"

Useful `make` Commands

make install    # Install dependencies with uv
make test       # Run pytest with coverage
make format     # Auto-format codebase with ruff
make check      # Strict linting (mypy + ruff)
make build      # Build wheel and sdist packages
make build-clean # Remove build artifacts

🤝 Contributing

We love contributions! Check out our AGENTS.md for development guidelines.

For maintainers: See RELEASING.md for how to publish new versions to PyPI.

The Golden Rules:

  1. make check should pass without complaints.
  2. make test should stay green.
  3. Don't drop test coverage (we like our 80% mark!).

📄 License

This project is licensed under the MIT License. See LICENSE for details.


Stand on the shoulders of giants: LangChain, MCP, and LangGraph.

If you find this useful, please consider giving it a ⭐ or buying me a coffee!


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

agntrick-0.2.3.tar.gz (66.9 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

agntrick-0.2.3-py3-none-any.whl (59.3 kB)

Uploaded Python 3

File details

Details for the file agntrick-0.2.3.tar.gz.

File metadata

  • Download URL: agntrick-0.2.3.tar.gz
  • Upload date:
  • Size: 66.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for agntrick-0.2.3.tar.gz
| Algorithm | Hash digest |
|---|---|
| SHA256 | a1ed1065e3960cda93deefa91e95f6c6962b7c6994daebcb91cb91013311ee82 |
| MD5 | ca9a2efd58e0f7eabdfa04696f1d4b01 |
| BLAKE2b-256 | 966bb67cb7952f7a0cbb99185f76da25b175f7b1032c09a00e03da122d3061fc |

See more details on using hashes here.

Provenance

The following attestation bundles were made for agntrick-0.2.3.tar.gz:

Publisher: release.yml on jeancsil/agntrick

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file agntrick-0.2.3-py3-none-any.whl.

File metadata

  • Download URL: agntrick-0.2.3-py3-none-any.whl
  • Upload date:
  • Size: 59.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for agntrick-0.2.3-py3-none-any.whl
| Algorithm | Hash digest |
|---|---|
| SHA256 | 9b7215f0378fe906d359ed54b3218803daa5c3064529cbab9c76499827acb352 |
| MD5 | dadb7bb1fbfdbd7388ad07daa0c752f3 |
| BLAKE2b-256 | c3cc240d07af32b7bde3c7691d4e0ddac8ca26a667a24972e5365558511452ce |

See more details on using hashes here.

Provenance

The following attestation bundles were made for agntrick-0.2.3-py3-none-any.whl:

Publisher: release.yml on jeancsil/agntrick

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
