# Cogency

A reasoning engine for adaptive AI agents.
```python
from cogency import Agent

agent = Agent("assistant")

# Simple task → direct response
agent.run("What's 2+2?")

# Complex task → adaptive reasoning that escalates depth and tool usage
agent.run("Analyze this codebase and suggest architectural improvements")
```
## Why Cogency?

**Zero ceremony, maximum capability**: get production-ready agents from a single import.
- 🔒 **Semantic security** - built-in safety that blocks unsafe requests automatically
- ⚡ **Adaptive reasoning** - thinks fast for simple queries, deep for complex tasks
- 🛠️ **Smart tooling** - tools auto-register and route intelligently
- 🧠 **Built-in memory** - persistent context that actually learns about users
- 🏗️ **Production ready** - resilience, tracing, and error recovery out of the box
## Get Started in 30 Seconds

```bash
pip install cogency
export OPENAI_API_KEY=...
```

```python
from cogency import Agent

agent = Agent("assistant")
result = agent.run("What's in the current directory?")
print(result)
```
That's it. No configuration, no setup, no tool registration. Just working agents.
## What Makes It Different

### Semantic Security

```python
agent.run("rm -rf /")           # ❌ Blocked automatically
agent.run("List files safely")  # ✅ Proceeds normally
```
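Cogency describes this check as semantic, not a string blocklist, so the internals are more involved than any short snippet. Purely as a stand-in to illustrate the blocking behavior, a naive pattern filter might look like:

```python
# Crude illustration only: Cogency's real check is semantic,
# not a hard-coded pattern list like this one.
DANGEROUS_PATTERNS = ("rm -rf", "mkfs", "dd if=")

def is_unsafe(request: str) -> bool:
    """Return True when a request matches a known-destructive pattern."""
    lowered = request.lower()
    return any(pattern in lowered for pattern in DANGEROUS_PATTERNS)

print(is_unsafe("rm -rf /"))          # → True
print(is_unsafe("List files safely")) # → False
```

A semantic check would instead classify the *intent* of the request, which also catches obfuscated variants a pattern list misses.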
### Adaptive Intelligence

```python
agent.run("What's 2+2?")          # Fast: direct response
agent.run("Analyze my codebase")  # Deep: multi-step reasoning
```
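The fast/deep split above can be pictured as a router in front of the model. This is a toy heuristic, not Cogency's actual routing logic (the keyword list and word-count threshold are invented for illustration):

```python
# Hypothetical router sketch: long or analysis-style prompts go to
# deep multi-step reasoning, everything else takes the fast path.
DEEP_KEYWORDS = ("analyze", "architect", "refactor", "design")

def choose_mode(task: str) -> str:
    lowered = task.lower()
    if len(task.split()) > 12 or any(k in lowered for k in DEEP_KEYWORDS):
        return "deep"
    return "fast"

print(choose_mode("What's 2+2?"))          # → fast
print(choose_mode("Analyze my codebase"))  # → deep
```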
### Memory That Actually Works

```python
agent = Agent("assistant", memory=True)
agent.run("I prefer Python and work at Google")
agent.run("What language should I use?")  # → "Python"
```
## Built-in Capabilities

Tools that just work:

- 📁 **Files** - read, write, edit any file
- 💻 **Shell** - execute commands safely
- 🌐 **HTTP** - API calls and requests
- 📖 **Scrape** - extract web content
- 🔍 **Search** - web search via DuckDuckGo
Plus add your own:

```python
@tool
class DatabaseTool(Tool):
    async def run(self, query: str):
        # `db` is your own database client
        return await db.execute(query)

# Automatically available to all agents
```
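The "automatically available" behavior suggests a decorator that records each tool class in a shared registry. This is a conceptual sketch of that pattern, not Cogency's internals (`TOOL_REGISTRY`, `EchoTool`, and this `tool` implementation are all hypothetical):

```python
import asyncio

# Decorator-based auto-registration sketch: @tool records each class
# in a module-level registry that agents could consult at runtime.
TOOL_REGISTRY: dict[str, type] = {}

def tool(cls):
    TOOL_REGISTRY[cls.__name__] = cls
    return cls

@tool
class EchoTool:
    async def run(self, query: str):
        return query

# Registration happens at class-definition time, so the tool is
# visible before any agent is constructed.
print("EchoTool" in TOOL_REGISTRY)        # → True
print(asyncio.run(EchoTool().run("ping")))  # → ping
```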
## Universal LLM Support

Works with any LLM - just set the API key:

```bash
# OpenAI
export OPENAI_API_KEY=sk-...

# Anthropic
export ANTHROPIC_API_KEY=sk-ant-...

# Google Gemini
export GEMINI_API_KEY=...

# Mistral
export MISTRAL_API_KEY=...

# OpenRouter (cost-effective)
export OPENROUTER_API_KEY=sk-or-v1-...

# Groq (high-performance)
export GROQ_API_KEY=gsk_...

# Ollama (local models)
export OLLAMA_API_KEY=...
```
No configuration needed - Cogency detects and configures automatically.
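Auto-detection of this kind typically means scanning well-known environment variables. A minimal stdlib sketch of the idea, assuming a fixed priority order (the ordering and function name are invented, not Cogency's actual behavior):

```python
import os

# Hypothetical provider detection: return the first provider whose
# API-key environment variable is set.
PROVIDER_ENV_VARS = [
    ("openai", "OPENAI_API_KEY"),
    ("anthropic", "ANTHROPIC_API_KEY"),
    ("gemini", "GEMINI_API_KEY"),
    ("mistral", "MISTRAL_API_KEY"),
    ("openrouter", "OPENROUTER_API_KEY"),
    ("groq", "GROQ_API_KEY"),
    ("ollama", "OLLAMA_API_KEY"),
]

def detect_provider(env=None):
    env = os.environ if env is None else env
    for name, var in PROVIDER_ENV_VARS:
        if env.get(var):
            return name
    raise RuntimeError("No LLM API key found in environment")

print(detect_provider({"GROQ_API_KEY": "gsk_demo"}))  # → groq
```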
## Production Features

Streaming responses:

```python
async for chunk in agent.stream("Analyze this large codebase"):
    print(chunk, end="")
```
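Since `agent.stream()` yields chunks asynchronously, it has to be consumed inside a coroutine. Here is a self-contained version of that consumption pattern, with a stand-in async generator in place of a real agent:

```python
import asyncio

# fake_stream is a stand-in for agent.stream(), used only to show
# the `async for` consumption pattern end to end.
async def fake_stream(text: str):
    for word in text.split():
        yield word

async def consume() -> str:
    chunks = []
    async for chunk in fake_stream("streamed agent output"):
        chunks.append(chunk)
    return " ".join(chunks)

print(asyncio.run(consume()))  # → streamed agent output
```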
Full observability:

```python
result = agent.run("Deploy my app")
logs = agent.logs()  # See exactly what happened
print(logs)  # ["🔧 triage: selected 2 tools", "💻 shell: deploying...", ...]
```
Error resilience:

```python
# Tool failures don't crash execution
agent.run("List files in /nonexistent")  # → Graceful error handling

# API timeouts auto-retry with backoff
# Memory failures don't block responses
```
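"Auto-retry with backoff" is a standard resilience pattern: re-invoke the failing call, doubling the wait between attempts. A generic sketch of the idea (not Cogency's actual retry code; function names and defaults are invented):

```python
import time

def retry_with_backoff(fn, retries=3, base_delay=0.01):
    """Call fn, retrying on exception with exponential backoff."""
    for attempt in range(retries):
        try:
            return fn()
        except Exception:
            if attempt == retries - 1:
                raise  # out of attempts: surface the error
            time.sleep(base_delay * (2 ** attempt))

# Demo: a call that fails twice, then succeeds on the third attempt.
calls = {"n": 0}

def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("transient")
    return "ok"

print(retry_with_backoff(flaky))  # → ok
```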
## Advanced Usage

```python
# Full customization when needed
agent = Agent(
    "assistant",
    memory=True,               # Persistent user context
    tools=["files", "shell"],  # Specific tools only
    max_iterations=20,         # Deep reasoning limit
    debug=True,                # Detailed execution logs
)
```
```python
# Custom memory configuration
from cogency.config import MemoryConfig

agent = Agent("assistant", memory=MemoryConfig(threshold=8000))

# Custom event handlers
agent = Agent("assistant", handlers=[websocket_handler])
```
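A handler list like the one above usually implies a simple fan-out: each execution event is passed to every registered callable. A minimal stdlib sketch of that contract (`EventBus` and the event string format are hypothetical, not Cogency's API):

```python
class EventBus:
    """Fan each emitted event out to every registered handler."""

    def __init__(self, handlers):
        self.handlers = list(handlers)

    def emit(self, event: str) -> None:
        for handler in self.handlers:
            handler(event)

# Any callable works as a handler: a websocket send, a logger, or
# here, simply appending to a list.
received = []
bus = EventBus([received.append])
bus.emit("tool:shell:start")
print(received)  # → ['tool:shell:start']
```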
## Documentation

- **Quick Start** - get running in 5 minutes
- **API Reference** - complete Agent class documentation
- **Tools** - built-in tools and custom tool creation
- **Examples** - detailed code examples and walkthroughs
- **Memory** - memory system documentation
- **Reasoning** - adaptive reasoning modes
## License

Apache 2.0
## Support

- Issues: GitHub Issues
- Discussions: GitHub Discussions
Built for developers who want agents that just work.
## File details

### cogency-1.2.1.tar.gz

- Size: 97.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.3 CPython/3.12.10 Darwin/24.5.0

| Algorithm | Hash digest |
|---|---|
| SHA256 | `d7f064b102bff7793b70d7e6caf768fc233645ef2d20bfdb552e1b167882111f` |
| MD5 | `1acc27382e271fe470ecba102e01263d` |
| BLAKE2b-256 | `235aea7e4c72531480ccd708a71ad7a4f9f80417ee422e69ff4a66a6765cf3c2` |
### cogency-1.2.1-py3-none-any.whl

- Size: 135.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.3 CPython/3.12.10 Darwin/24.5.0

| Algorithm | Hash digest |
|---|---|
| SHA256 | `cf2c8893a9554fe06391e75ed11b077f7fdc80f1bb1fa6c2cbbf79d1885256c7` |
| MD5 | `ae5e5191cd548bf58dcb26c2c9101647` |
| BLAKE2b-256 | `a5a0931cd8d49c03d49503663507cad0c0624302c4bf95b2acc8988957290a9b` |