# Cogency

*Agentic AI out of the box: 3-line AI agents that just work.*

Cogency is a multistep reasoning framework that makes building AI agents stupidly simple. It auto-detects providers, intelligently routes tools, and streams transparent reasoning.
## Installation

```bash
# Currently broken until the next release, v0.3.1 (pending).
# Recommended: fork and install from source.
pip install cogency
```
## Quick Start

```python
import asyncio

from cogency import Agent

async def main():
    # That's it. Auto-detects the LLM from .env.
    agent = Agent("assistant")
    result = await agent.run("What is 25 * 43?", mode="summary")
    print(result)

asyncio.run(main())
```

Requirements: Python 3.9+, API key in `.env`.
## Core Philosophy

- **Zero ceremony** - 3 lines to a working agent
- **Magical auto-detection** - detects OpenAI, Anthropic, Gemini, Grok, and Mistral from the environment
- **Intelligent tool routing** - the LLM filters relevant tools, keeping prompts lean
- **Stream-first** - watch agents think in real time
- **Plug-and-play** - drop in new tools and they just work
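The auto-detection idea above can be pictured as a first-match lookup over well-known environment variables. This is only an illustrative sketch, not Cogency's actual implementation; the variable names and the priority order are assumptions.

```python
import os

# Hypothetical mapping from conventional env vars to provider names.
# (Assumption: these are NOT necessarily the names Cogency checks.)
PROVIDER_ENV_VARS = {
    "OPENAI_API_KEY": "openai",
    "ANTHROPIC_API_KEY": "anthropic",
    "GEMINI_API_KEY": "gemini",
    "XAI_API_KEY": "grok",
    "MISTRAL_API_KEY": "mistral",
}

def detect_provider(env=os.environ):
    """Return the first provider whose API key is present in the environment."""
    for var, provider in PROVIDER_ENV_VARS.items():
        if env.get(var):
            return provider
    raise RuntimeError("No supported LLM API key found in environment")
```

Passing a plain dict instead of `os.environ` makes the lookup easy to test; the first matching key wins, which is one simple way to resolve the case where several keys are set.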
## Examples

### Hello World

```python
import asyncio

from cogency import Agent

async def main():
    agent = Agent("assistant")
    result = await agent.run("What is 25 * 43?", mode="summary")
    print(result)  # 1075

asyncio.run(main())
```
### With Tools

```python
import asyncio

from cogency import Agent, WeatherTool

async def main():
    agent = Agent("weather_assistant", tools=[WeatherTool()])
    result = await agent.run("What's the weather in San Francisco?", mode="summary")
    print(result)

asyncio.run(main())
```
### Streaming Reasoning

```python
import asyncio

from cogency import Agent, CalculatorTool, WebSearchTool

async def main():
    agent = Agent("analyst", tools=[CalculatorTool(), WebSearchTool()])
    # Stream with trace mode for full visibility.
    async for chunk in agent.stream("Find Bitcoin price and calculate value of 0.5 BTC", mode="trace"):
        print(chunk, end="", flush=True)

asyncio.run(main())
```
### Custom Tools

```python
from cogency import Agent, BaseTool

class TimezoneTool(BaseTool):
    def __init__(self):
        super().__init__("timezone", "Get time in any city")

    async def run(self, city: str):
        return {"time": f"Current time in {city}: 14:30 PST"}

    def get_schema(self):
        return "timezone(city='string')"

agent = Agent("time_assistant", tools=[TimezoneTool()])
```
## Installation

```bash
pip install cogency

# Set your API keys
echo "OPENAI_API_KEY=sk-..." >> .env
echo "ANTHROPIC_API_KEY=sk-ant-..." >> .env
echo "GEMINI_API_KEY=your-key-here" >> .env

# The agent auto-detects from any available key.
```
## Supported Providers

**LLMs (auto-detected):**

- OpenAI (GPT-4, GPT-3.5)
- Anthropic (Claude)
- Google (Gemini)
- xAI (Grok)
- Mistral

**Embeddings (auto-detected):**

- OpenAI (text-embedding-3)
- Nomic (nomic-embed-text)
- Sentence Transformers (local)

**Built-in tools:**

- **Calculator** - math operations
- **Weather** - real weather data (no API key)
- **Timezone** - world time (no API key)
- **WebSearch** - internet search
- **FileManager** - file operations
## PRARR Architecture

Cogency uses Plan-Reason-Act-Reflect-Respond for transparent multi-step reasoning:

📋 PLAN → Analyze the request, filter relevant tools
🧠 REASON → Determine the tool-usage strategy
⚡ ACT → Execute tools, gather results
🔍 REFLECT → Filter noise from results, keep what matters
💬 RESPOND → Generate a clean final answer

Every step is streamable and traceable.
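The pipeline above can be sketched as five plain functions wired in sequence, which is what makes each step individually streamable and traceable. This is a toy illustration under assumed heuristics (keyword-based tool selection, last-result response), not Cogency's internals.

```python
# Illustrative PRARR loop. Tools are plain dicts: {"name": ..., "run": callable}.
def plan(request, tools):
    # PLAN: keep only tools whose name appears in the request (toy heuristic).
    return [t for t in tools if t["name"] in request.lower()]

def reason(selected):
    # REASON: decide an execution strategy (toy: use selected tools in order).
    return list(selected)

def act(strategy, request):
    # ACT: execute each tool and gather results.
    return [t["run"](request) for t in strategy]

def reflect(results):
    # REFLECT: drop empty results, keep what matters.
    return [r for r in results if r]

def respond(results):
    # RESPOND: produce a clean final answer.
    return results[-1] if results else "No tool output."

def prarr(request, tools):
    selected = plan(request, tools)
    strategy = reason(selected)
    results = act(strategy, request)
    focused = reflect(results)
    return respond(focused)
```

Because every phase is a separate function with a plain return value, each intermediate result can be logged or yielded to a stream before the next phase runs.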
## Output Modes

Cogency supports three output modes:

**Summary mode (default):** clean final answer only

```python
result = await agent.run("What's 15 * 23?", mode="summary")
print(result)  # "345"
```
**Trace mode:** beautiful execution trace plus the answer

```python
result = await agent.run("What's 15 * 23?", mode="trace")
# Outputs:
# 🚀 EXECUTION TRACE (450ms total)
# ==================================================
# 🔸 PLAN [14:30:15] 120ms
#    📥 'What's 15 * 23?'
#    📤 Decision: tool_needed
#
# 🔸 ACT [14:30:15] 200ms
#    📥 calculator(expression="15 * 23")
#    📤 Result: 345
# ==================================================
# ✅ Final: 345
```
**Dev mode:** raw state dumps for debugging

```python
result = await agent.run("What's 15 * 23?", mode="dev")
# Outputs the full AgentState with all internal data
```
**Streaming:** real-time output with any mode

```python
async for chunk in agent.stream("Calculate something", mode="trace"):
    print(chunk, end="", flush=True)
```
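The `async for` consumption pattern above works against any async generator. As a self-contained illustration of the stream-first shape (not Cogency's code), a producer that yields chunks as they become available looks like this:

```python
import asyncio

async def stream_chunks(text, size=4):
    # Toy async generator: yield the output in small chunks, giving the
    # event loop a chance to run other tasks between chunks.
    for i in range(0, len(text), size):
        yield text[i:i + size]
        await asyncio.sleep(0)

async def main():
    chunks = [c async for c in stream_chunks("Final: 345")]
    return "".join(chunks)
```

Consumers print chunks as they arrive; reassembling them reproduces the full answer.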
## Key Features

- **Auto-detection** - zero-config provider setup
- **Tool subsetting** - intelligent filtering keeps prompts lean
- **Key rotation** - load-balance across multiple API keys
- **Result filtering** - execution metadata removed in REFLECT
- **Stream transparency** - watch reasoning in real time
- **Beautiful traces** - human-readable execution logs
- **Plug-and-play** - drop in tools and they auto-register
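Key rotation, as listed above, is commonly implemented as a round-robin over the configured keys. This is a hedged sketch of that general technique, not Cogency's actual rotation logic:

```python
from itertools import cycle

class KeyRotator:
    """Round-robin over API keys so successive requests spread load evenly."""

    def __init__(self, keys):
        if not keys:
            raise ValueError("At least one API key is required")
        self._cycle = cycle(keys)

    def next_key(self):
        # Each call returns the next key, wrapping around at the end.
        return next(self._cycle)
```

Per-request rotation like this spreads traffic across keys and rate limits; fancier schemes (least-recently-used, failure-aware) follow the same interface.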
## Contributing

The framework is designed for extension:

```python
# Add a new LLM provider
class YourLLM(ProviderMixin, BaseLLM):
    async def invoke(self, messages, **kwargs):
        ...  # your implementation

# Add a new tool
class YourTool(BaseTool):
    async def run(self, **params):
        ...  # your implementation
```

That's it. Auto-discovery handles the rest.
## License

MIT - build whatever you want.

*Cogency: AI agents without the ceremony.*
## File details

Details for the file `cogency-0.4.0.tar.gz`.

### File metadata

- Download URL: cogency-0.4.0.tar.gz
- Upload date:
- Size: 61.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.3 CPython/3.12.10 Darwin/24.5.0

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 656f49ee81a6e118eba62894e48da527772a328f6986d99e39061049947c068f |
| MD5 | 92cab6ff3365c99553432aab01a363ef |
| BLAKE2b-256 | 41e4ddfce327a5747768a90ad50f5ac1fc49a6d3977b284789cd8e06ce2a4b1c |
## File details

Details for the file `cogency-0.4.0-py3-none-any.whl`.

### File metadata

- Download URL: cogency-0.4.0-py3-none-any.whl
- Upload date:
- Size: 86.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.3 CPython/3.12.10 Darwin/24.5.0

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 14d9528235a517e87b4fc812f433ea7c2f2112c7c248dfd3a6d910b3befc604c |
| MD5 | f3c099cc3280370ba765e6505860ccc0 |
| BLAKE2b-256 | addf3aef3a7e0804e9d3d739c3eddfc9f80e53a40c3698a13d8edf0a7783f656 |