Agntrick: An agentic framework for building AI-powered applications

🎩 Agntrick

Build AI agents that actually do things.


Combine local tools and MCP servers in a single, elegant runtime. Write agents in 5 lines of code. Run them anywhere.


💡 Why Agntrick?

Instead of spending days wiring together LLMs, tools, and execution environments, Agntrick gives you a production-ready setup instantly.

  • Write Less, Do More: Create a fully functional agent with just 5 lines of Python using the zero-config @AgentRegistry.register decorator.
  • Context is King (MCP): Native integration with Model Context Protocol (MCP) servers to give your agents live data (Web search, APIs, internal databases).
  • Hardcore Local Tools: Built-in, blazing-fast tools (ripgrep, fd, AST parsing) so your agents can explore and understand local codebases out of the box.
  • Stateful & Resilient: Powered by LangGraph to support memory, cyclic reasoning, and human-in-the-loop workflows.
  • Docker-First Isolation: Every agent runs in isolated containers—no more "it works on my machine" when sharing with your team.

📦 Installation

From PyPI

pip install agntrick

# Or with development dependencies
pip install "agntrick[dev]"

From Source

git clone https://github.com/jeancsil/agntrick.git
cd agntrick
make install

🚀 Quick Start

1. Add your Brain (API Key)

You need an LLM API key to breathe life into your agents. Agntrick supports 10 LLM providers via LangChain!

# Copy the template
cp .env.example .env

# Edit .env and paste your API key
# Choose one of the following providers:
# OPENAI_API_KEY=sk-your-key-here
# ANTHROPIC_API_KEY=sk-ant-your-key-here
# GOOGLE_API_KEY=your-google-key
# GROQ_API_KEY=gsk-your-key-here
# MISTRAL_API_KEY=your-mistral-key-here
# COHERE_API_KEY=your-cohere-key-here

# For Ollama (local), no API key needed:
# OLLAMA_BASE_URL=http://localhost:11434

2. Run Your First Agent

# List all available agents
agntrick list

# Run an agent with input
agntrick developer -i "Explain this codebase"

# Or try the learning agent with web search
agntrick learning -i "Explain quantum computing in simple terms"

🔑 Supported Environment Variables

Only one provider's API key is required. The framework auto-detects which provider to use based on available credentials.

# Anthropic (Recommended)
ANTHROPIC_API_KEY=sk-ant-your-key-here

# OpenAI
OPENAI_API_KEY=sk-your-key-here

# Google GenAI / Vertex
GOOGLE_API_KEY=your-google-key
GOOGLE_VERTEX_PROJECT_ID=your-project-id

# Mistral AI
MISTRAL_API_KEY=your-mistral-key-here

# Cohere
COHERE_API_KEY=your-cohere-key-here

# Azure OpenAI
AZURE_OPENAI_API_KEY=your-azure-key
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com

# AWS Bedrock
AWS_PROFILE=your-profile

# Ollama (Local, no API key needed)
OLLAMA_BASE_URL=http://localhost:11434

# Hugging Face
HUGGINGFACEHUB_API_TOKEN=your-hf-token

📖 See docs/llm-providers.md for detailed environment variable configurations and provider comparison.
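
The auto-detection described above can be sketched as a priority scan over environment variables. This is an illustrative sketch only; the scan order and the `detect_provider` name are assumptions, not Agntrick's actual implementation:

```python
import os

# Hypothetical priority order -- the framework's real order may differ.
PROVIDER_ENV_VARS = [
    ("anthropic", "ANTHROPIC_API_KEY"),
    ("openai", "OPENAI_API_KEY"),
    ("google", "GOOGLE_API_KEY"),
    ("mistral", "MISTRAL_API_KEY"),
    ("cohere", "COHERE_API_KEY"),
    ("azure", "AZURE_OPENAI_API_KEY"),
    ("ollama", "OLLAMA_BASE_URL"),
]

def detect_provider(env=None):
    """Return the first provider whose credentials are present."""
    if env is None:
        env = os.environ
    for provider, var in PROVIDER_ENV_VARS:
        if env.get(var):
            return provider
    raise RuntimeError("No LLM provider credentials found; see .env.example")
```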


🧰 Available Out of the Box

🤖 Bundled Agents

Agntrick includes several pre-built agents for common use cases:

| Agent | Purpose | MCP Servers |
|---|---|---|
| `developer` | Code Master: Read, search & edit code | `fetch` |
| `github-pr-reviewer` | PR Reviewer: Reviews diffs, posts inline comments & summaries | - |
| `learning` | Tutor: Step-by-step tutorials and explanations | `fetch`, `web-forager` |
| `news` | News Anchor: Aggregates top stories | `fetch` |
| `youtube` | Video Analyst: Extract insights from YouTube videos | `fetch` |

📖 See docs/agents.md for detailed information about each agent.


📦 Local Tools

Fast, zero-dependency tools for working with local codebases:

| Tool | Capability |
|---|---|
| `find_files` | Fast file search via fd |
| `discover_structure` | Directory tree mapping |
| `get_file_outline` | AST signature parsing |
| `read_file_fragment` | Precise file reading |
| `code_search` | Fast content search via ripgrep |
| `edit_file` | Safe file editing |
| `youtube_transcript` | Extract transcripts from YouTube videos |

📖 See docs/tools.md for detailed documentation of each tool.


🌐 MCP Servers

Model Context Protocol servers for extending agent capabilities:

| Server | Purpose |
|---|---|
| `fetch` | Extract clean text from URLs |
| `web-forager` | Web search and content fetching |
| `kiwi-com-flight-search` | Search real-time flights |

📖 See docs/mcp-servers.md for details on each server and how to add custom MCP servers.


🧠 LLM Providers

Agntrick supports 10 LLM providers out of the box, from major cloud APIs to fully local models:

| Provider | Type | Use Case |
|---|---|---|
| Anthropic | Cloud | State-of-the-art reasoning (Claude) |
| OpenAI | Cloud | GPT-4, GPT-4.1, o1 series |
| Azure OpenAI | Cloud | Enterprise OpenAI deployments |
| Google GenAI | Cloud | Gemini models via API |
| Google Vertex AI | Cloud | Gemini models via GCP |
| Mistral AI | Cloud | European privacy-focused models |
| Cohere | Cloud | Enterprise RAG and Command models |
| AWS Bedrock | Cloud | Anthropic, Titan, Meta via AWS |
| Ollama | Local | Run LLMs locally (zero API cost) |
| Hugging Face | Cloud | Open models from Hugging Face Hub |

📖 See docs/llm-providers.md for detailed setup instructions.


🛠️ Build Your Own Agent

The 5-Line Superhero 🦸‍♂️

from agntrick import AgentBase, AgentRegistry

@AgentRegistry.register("my-agent", mcp_servers=["fetch"])
class MyAgent(AgentBase):
    @property
    def system_prompt(self) -> str:
        return "You are my custom agent with the power to fetch websites."

Boom. Run it instantly:

agntrick my-agent -i "Summarize https://example.com"

Advanced: Custom Local Tools 🔧

Want to add your own Python logic? Easy.

from langchain_core.tools import StructuredTool
from agntrick import AgentBase, AgentRegistry

@AgentRegistry.register("data-processor")
class DataProcessorAgent(AgentBase):
    @property
    def system_prompt(self) -> str:
        return "You process data files like a boss."

    def local_tools(self) -> list:
        return [
            StructuredTool.from_function(
                func=self.process_csv,
                name="process_csv",
                description="Process a CSV file path",
            )
        ]

    def process_csv(self, filepath: str) -> str:
        # Magic happens here ✨
        return f"Successfully processed {filepath}!"

⚙️ Configuration

Agntrick can be configured via a .agntrick.yaml file in your project root or home directory:

# .agntrick.yaml
llm:
  provider: anthropic  # or openai, google, ollama, etc.
  model: claude-sonnet-4-6  # optional model override
  temperature: 0.7

mcp:
  servers:
    - fetch
    - web-forager

logging:
  level: INFO
  file: logs/agent.log
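
When both a project-root and a home-directory `.agntrick.yaml` exist, a natural behavior is for the project file to override the home file key by key. The sketch below shows such a recursive merge; the lookup order and merge semantics are assumptions, not Agntrick's documented behavior:

```python
from pathlib import Path

def merge_config(base: dict, override: dict) -> dict:
    """Recursively merge override into base; override wins on conflicts."""
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge_config(merged[key], value)
        else:
            merged[key] = value
    return merged

def config_paths() -> list[Path]:
    """Candidate config files, lowest precedence first (assumed order)."""
    return [Path.home() / ".agntrick.yaml", Path.cwd() / ".agntrick.yaml"]
```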

💻 CLI Reference

Command your agents directly from the terminal.

# 📋 List all registered agents
agntrick list

# 🕵️ Get detailed info about what an agent can do
agntrick info developer

# 🚀 Run an agent with input
agntrick developer -i "Analyze the architecture of this project"

# ⏱️ Run with an execution timeout (seconds)
agntrick developer -i "Refactor this module" -t 120

# 📝 Run with debug-level verbosity
agntrick developer -i "Hello" -v

# 📜 View logs
tail -f logs/agent.log

🏗️ Architecture

Under the hood, Agntrick bridges the gap between user intent and execution:

flowchart TB
    subgraph User [👤 User Space]
        Input[User Input]
    end

    subgraph CLI [💻 CLI - agntrick]
        Typer[Typer Interface]
    end

    subgraph Registry [📋 Registry]
        AR[AgentRegistry]
        AD[Auto-discovery]
    end

    subgraph Agents [🤖 Agents]
        Dev[developer agent]
        Learning[learning agent]
        News[news agent]
    end

    subgraph Core [🧠 Core Engine]
        AB[AgentBase]
        LG[LangGraph Runtime]
        CP[(Checkpointing)]
    end

    subgraph Tools [🧰 Tools & Skills]
        LT[Local Tools]
        MCP[MCP Tools]
    end

    subgraph External [🌍 External World]
        LLM[LLM API]
        MCPS[MCP Servers]
    end

    Input --> Typer
    Typer --> AR
    AR --> AD
    AR -->|Routes to| Dev & Learning & News

    Dev & Learning & News -->|Inherits from| AB

    AB --> LG
    LG <--> CP
    AB -->|Uses| LT
    AB -->|Uses| MCP

    LT -->|Reasoning| LLM
    MCP -->|Queries| MCPS
    MCPS -->|Provides Data| LLM

    LLM --> Output[Final Response]
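
The Registry box in the diagram follows a standard decorator-based registry pattern. Here is a toy version for intuition only; the method names mirror the public API from the Quick Start, but the internals are assumptions:

```python
class AgentRegistry:
    """Toy registry sketch. Agntrick's real registry also wires MCP servers
    and performs auto-discovery of agent modules."""

    _agents: dict[str, type] = {}

    @classmethod
    def register(cls, name: str, **options):
        # Returns a class decorator that records the agent under `name`.
        def decorator(agent_cls):
            cls._agents[name] = agent_cls
            agent_cls.registry_options = options  # hypothetical attribute
            return agent_cls
        return decorator

    @classmethod
    def get(cls, name: str) -> type:
        return cls._agents[name]

@AgentRegistry.register("echo", mcp_servers=["fetch"])
class EchoAgent:
    pass
```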

🧑‍💻 Local Development

System Requirements & Setup

Requirements:

  • Python 3.12+
  • uv package manager
  • ripgrep, fd, fzf (for local tools)

# Install dependencies (blazingly fast with uv ⚡)
make install

# Run the test suite
make test

# Run agents directly
agntrick developer -i "Hello"

Useful `make` Commands

make install     # Install dependencies with uv
make test        # Run pytest with coverage
make format      # Auto-format codebase with ruff
make check       # Strict linting (mypy + ruff)
make build       # Build wheel and sdist packages
make build-clean # Remove build artifacts

🤝 Contributing

We love contributions! Check out our AGENTS.md for development guidelines.

For maintainers: See RELEASING.md for how to publish new versions to PyPI.

The Golden Rules:

  1. make check should pass without complaints.
  2. make test should stay green.
  3. Don't drop test coverage (we like our 80% mark!).

📄 License

This project is licensed under the MIT License. See LICENSE for details.


Stand on the shoulders of giants: LangChain · LangGraph · MCP

If you find this useful, please consider giving it a ⭐ or buying me a coffee!
