Agntrick: An agentic framework for building AI-powered applications
🎩 Agntrick
Build AI agents that actually do things.
Combine local tools and MCP servers in a single, elegant runtime. Write agents in 5 lines of code. Run them anywhere.
💡 Why Agntrick?
Instead of spending days wiring together LLMs, tools, and execution environments, Agntrick gives you a production-ready setup instantly.
- Write Less, Do More: Create a fully functional agent with just 5 lines of Python using the zero-config `@AgentRegistry.register` decorator.
- Context is King (MCP): Native integration with Model Context Protocol (MCP) servers to give your agents live data (web search, APIs, internal databases).
- Hardcore Local Tools: Built-in blazing fast tools (`ripgrep`, `fd`, AST parsing) so your agents can explore and understand local codebases out-of-the-box.
- Stateful & Resilient: Powered by LangGraph to support memory, cyclic reasoning, and human-in-the-loop workflows.
- Docker-First Isolation: Every agent runs in isolated containers—no more "it works on my machine" when sharing with your team.
📦 Installation
From PyPI
pip install agntrick
# Or with development dependencies
pip install "agntrick[dev]"
From Source
git clone https://github.com/jeancsil/agntrick.git
cd agntrick
make install
🚀 Quick Start
1. Add your Brain (API Key)
You need an LLM API key to breathe life into your agents. Agntrick supports 10+ LLM providers via LangChain!
# Copy the template
cp .env.example .env
# Edit .env and paste your API key
# Choose one of the following providers:
# OPENAI_API_KEY=sk-your-key-here
# ANTHROPIC_API_KEY=sk-ant-your-key-here
# GOOGLE_API_KEY=your-google-key
# GROQ_API_KEY=gsk-your-key-here
# MISTRAL_API_KEY=your-mistral-key-here
# COHERE_API_KEY=your-cohere-key-here
# For Ollama (local), no API key needed:
# OLLAMA_BASE_URL=http://localhost:11434
2. Run Your First Agent
# List all available agents
agntrick list
# Run an agent with input
agntrick developer -i "Explain this codebase"
# Or try the learning agent with web search
agntrick learning -i "Explain quantum computing in simple terms"
🔑 Supported Environment Variables
Only one provider's API key is required. The framework auto-detects which provider to use based on available credentials.
# Anthropic (Recommended)
ANTHROPIC_API_KEY=sk-ant-your-key-here
# OpenAI
OPENAI_API_KEY=sk-your-key-here
# Google GenAI / Vertex
GOOGLE_API_KEY=your-google-key
GOOGLE_VERTEX_PROJECT_ID=your-project-id
# Mistral AI
MISTRAL_API_KEY=your-mistral-key-here
# Cohere
COHERE_API_KEY=your-cohere-key-here
# Azure OpenAI
AZURE_OPENAI_API_KEY=your-azure-key
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
# AWS Bedrock
AWS_PROFILE=your-profile
# Ollama (Local, no API key needed)
OLLAMA_BASE_URL=http://localhost:11434
# Hugging Face
HUGGINGFACEHUB_API_TOKEN=your-hf-token
📖 See docs/llm-providers.md for detailed environment variable configurations and provider comparison.
🧰 Available Out of the Box
🤖 Bundled Agents
Agntrick includes several pre-built agents for common use cases:
| Agent | Purpose | MCP Servers |
|---|---|---|
| `developer` | Code Master: Read, search & edit code | fetch |
| `github-pr-reviewer` | PR Reviewer: Reviews diffs, posts inline comments & summaries | - |
| `learning` | Tutor: Step-by-step tutorials and explanations | fetch, web-forager |
| `news` | News Anchor: Aggregates top stories | fetch |
| `youtube` | Video Analyst: Extract insights from YouTube videos | fetch |
📖 See docs/agents.md for detailed information about each agent.
📦 Local Tools
Fast, zero-dependency tools for working with local codebases:
| Tool | Capability |
|---|---|
| `find_files` | Fast search via `fd` |
| `discover_structure` | Directory tree mapping |
| `get_file_outline` | AST signature parsing |
| `read_file_fragment` | Precise file reading |
| `code_search` | Fast search via `ripgrep` |
| `edit_file` | Safe file editing |
| `youtube_transcript` | Extract transcripts from YouTube videos |
📖 See docs/tools.md for detailed documentation of each tool.
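To give a feel for what "AST signature parsing" means, here is a minimal sketch in the spirit of `get_file_outline`, built on the standard library's `ast` module. The `outline` function and its output format are illustrative assumptions; the bundled tool's real behavior may differ:

```python
# Hypothetical sketch of AST-based outline extraction (illustrative
# only; the bundled get_file_outline tool may produce different output).
import ast


def outline(source: str) -> list[str]:
    """List top-level function and class signatures from Python source."""
    entries = []
    for node in ast.parse(source).body:
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            # Only positional arguments are shown, to keep the sketch short.
            args = ", ".join(a.arg for a in node.args.args)
            entries.append(f"def {node.name}({args})")
        elif isinstance(node, ast.ClassDef):
            entries.append(f"class {node.name}")
    return entries
```

Parsing signatures this way lets an agent map a module's API without reading (or sending the LLM) every line of the file.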
🌐 MCP Servers
Model Context Protocol servers for extending agent capabilities:
| Server | Purpose |
|---|---|
| `fetch` | Extract clean text from URLs |
| `web-forager` | Web search and content fetching |
| `kiwi-com-flight-search` | Search real-time flights |
📖 See docs/mcp-servers.md for details on each server and how to add custom MCP servers.
🧠 LLM Providers
Agntrick supports 10 LLM providers out of the box, covering the most widely used cloud and local options:
| Provider | Type | Use Case |
|---|---|---|
| Anthropic | Cloud | State-of-the-art reasoning (Claude) |
| OpenAI | Cloud | GPT-4, GPT-4.1, o1 series |
| Azure OpenAI | Cloud | Enterprise OpenAI deployments |
| Google GenAI | Cloud | Gemini models via API |
| Google Vertex AI | Cloud | Gemini models via GCP |
| Mistral AI | Cloud | European privacy-focused models |
| Cohere | Cloud | Enterprise RAG and Command models |
| AWS Bedrock | Cloud | Anthropic, Titan, Meta via AWS |
| Ollama | Local | Run LLMs locally (zero API cost) |
| Hugging Face | Cloud | Open models from Hugging Face Hub |
📖 See docs/llm-providers.md for detailed setup instructions.
🛠️ Build Your Own Agent
The 5-Line Superhero 🦸‍♂️
from agntrick import AgentBase, AgentRegistry

@AgentRegistry.register("my-agent", mcp_servers=["fetch"])
class MyAgent(AgentBase):
    @property
    def system_prompt(self) -> str:
        return "You are my custom agent with the power to fetch websites."
Boom. Run it instantly:
agntrick my-agent -i "Summarize https://example.com"
Advanced: Custom Local Tools 🔧
Want to add your own Python logic? Easy.
from langchain_core.tools import StructuredTool
from agntrick import AgentBase, AgentRegistry

@AgentRegistry.register("data-processor")
class DataProcessorAgent(AgentBase):
    @property
    def system_prompt(self) -> str:
        return "You process data files like a boss."

    def local_tools(self) -> list:
        return [
            StructuredTool.from_function(
                func=self.process_csv,
                name="process_csv",
                description="Process a CSV file path",
            )
        ]

    def process_csv(self, filepath: str) -> str:
        # Magic happens here ✨
        return f"Successfully processed {filepath}!"
⚙️ Configuration
Agntrick can be configured via a .agntrick.yaml file in your project root or home directory:
# .agntrick.yaml
llm:
  provider: anthropic  # or openai, google, ollama, etc.
  model: claude-sonnet-4-6  # optional model override
  temperature: 0.7
mcp:
  servers:
    - fetch
    - web-forager
logging:
  level: INFO
  file: logs/agent.log
💻 CLI Reference
Command your agents directly from the terminal.
# 📋 List all registered agents
agntrick list
# 🕵️ Get detailed info about what an agent can do
agntrick info developer
# 🚀 Run an agent with input
agntrick developer -i "Analyze the architecture of this project"
# ⏱️ Run with an execution timeout (seconds)
agntrick developer -i "Refactor this module" -t 120
# 📝 Run with debug-level verbosity
agntrick developer -i "Hello" -v
# 📜 View logs
tail -f logs/agent.log
🏗️ Architecture
Under the hood, we seamlessly bridge the gap between user intent and execution:
flowchart TB
    subgraph User [👤 User Space]
        Input[User Input]
    end
    subgraph CLI [💻 CLI - agntrick]
        Typer[Typer Interface]
    end
    subgraph Registry [📋 Registry]
        AR[AgentRegistry]
        AD[Auto-discovery]
    end
    subgraph Agents [🤖 Agents]
        Dev[developer agent]
        Learning[learning agent]
        News[news agent]
    end
    subgraph Core [🧠 Core Engine]
        AB[AgentBase]
        LG[LangGraph Runtime]
        CP[(Checkpointing)]
    end
    subgraph Tools [🧰 Tools & Skills]
        LT[Local Tools]
        MCP[MCP Tools]
    end
    subgraph External [🌍 External World]
        LLM[LLM API]
        MCPS[MCP Servers]
    end
    Input --> Typer
    Typer --> AR
    AR --> AD
    AR -->|Routes to| Dev & Learning & News
    Dev & Learning & News -->|Inherits from| AB
    AB --> LG
    LG <--> CP
    AB -->|Uses| LT
    AB -->|Uses| MCP
    LT -->|Reasoning| LLM
    MCP -->|Queries| MCPS
    MCPS -->|Provides Data| LLM
    LLM --> Output[Final Response]
🧑‍💻 Local Development
System Requirements & Setup
Requirements:
- Python 3.12+
- `uv` package manager
- `ripgrep`, `fd`, `fzf` (for local tools)
# Install dependencies (blazingly fast with uv ⚡)
make install
# Run the test suite
make test
# Run agents directly
agntrick developer -i "Hello"
Useful `make` Commands
make install # Install dependencies with uv
make test # Run pytest with coverage
make format # Auto-format codebase with ruff
make check # Strict linting (mypy + ruff)
make build # Build wheel and sdist packages
make build-clean # Remove build artifacts
🤝 Contributing
We love contributions! Check out our AGENTS.md for development guidelines.
For maintainers: See RELEASING.md for how to publish new versions to PyPI.
The Golden Rules:
- `make check` should pass without complaints.
- `make test` should stay green.
- Don't drop test coverage (we like our 80% mark!).
📄 License
This project is licensed under the MIT License. See LICENSE for details.
Stand on the shoulders of giants:
If you find this useful, please consider giving it a ⭐ or buying me a coffee!