
Cogency

Requires Python 3.10+ · Apache 2.0 license

A reasoning engine for adaptive AI agents.

from cogency import Agent
agent = Agent("assistant")

# Simple task → direct response
agent.run("What's 2+2?")

# Complex task → adaptive reasoning
agent.run("Analyze this codebase and suggest architectural improvements")
# Automatically escalates reasoning depth and tool usage

Why Cogency?

Zero ceremony, maximum capability - get production-ready agents from a single import.

  • 🔒 Semantic security - Built-in safety, blocks unsafe requests automatically
  • ⚡ Adaptive reasoning - Thinks fast for simple queries, deep for complex tasks
  • 🛠️ Smart tooling - Tools auto-register and route intelligently
  • 🧠 Built-in memory - Persistent context that actually learns about users
  • 🏗️ Production ready - Resilience, tracing, and error recovery out of the box

Get Started in 30 Seconds

pip install cogency
export OPENAI_API_KEY=...

from cogency import Agent

agent = Agent("assistant")
result = agent.run("What's in the current directory?")
print(result)

That's it. No configuration, no setup, no tool registration. Just working agents.

What Makes It Different

Semantic Security

agent.run("rm -rf /")  # ❌ Blocked automatically
agent.run("List files safely")  # ✅ Proceeds normally

Adaptive Intelligence

agent.run("What's 2+2?")  # Fast: Direct response
agent.run("Analyze my codebase")  # Deep: Multi-step reasoning

Memory That Actually Works

agent = Agent("assistant", memory=True)
agent.run("I prefer Python and work at Google")
agent.run("What language should I use?")  # → "Python"

Built-in Capabilities

Tools that just work:

  • 📁 Files - Read, write, edit any file
  • 💻 Shell - Execute commands safely
  • 🌐 HTTP - API calls and requests
  • 📖 Scrape - Extract web content
  • 🔍 Search - Web search via DuckDuckGo

Plus add your own:

# Assuming Tool and @tool are exported from cogency
from cogency import Tool, tool

@tool
class DatabaseTool(Tool):
    async def run(self, query: str):
        # `db` is your own async database client, defined elsewhere
        return await db.execute(query)

# Automatically available to all agents
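For context, here is a self-contained sketch of the async `run` pattern above, with the cogency decorator and base class omitted and a stubbed database client standing in for `db` (nothing in this sketch is cogency API):

```python
import asyncio

class StubDB:
    """Stand-in for a real async database client."""
    async def execute(self, query: str) -> str:
        # A real client would query the database; here we just echo.
        return f"rows for: {query}"

db = StubDB()

class DatabaseTool:
    """Same shape as the custom tool above, minus the cogency base class."""
    async def run(self, query: str) -> str:
        return await db.execute(query)

result = asyncio.run(DatabaseTool().run("SELECT 1"))
print(result)  # → rows for: SELECT 1
```

The key point is simply that `run` is a coroutine, so the agent can await slow I/O (database, network) without blocking other work.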

Universal LLM Support

Works with any LLM - just set the API key:

# OpenAI
export OPENAI_API_KEY=sk-...

# Anthropic  
export ANTHROPIC_API_KEY=sk-ant-...

# Google Gemini
export GEMINI_API_KEY=...

# Mistral
export MISTRAL_API_KEY=...

# OpenRouter (cost-effective)
export OPENROUTER_API_KEY=sk-or-v1-...

# Groq (high-performance) 
export GROQ_API_KEY=gsk_...

# Ollama (local models)
export OLLAMA_API_KEY=...

No configuration needed - Cogency detects and configures automatically.
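Presumably the auto-detection amounts to scanning the environment for the first recognized key; a hypothetical sketch (`detect_provider` and the priority order are illustrative, not cogency's actual code):

```python
import os

# Provider env vars from the list above, in an assumed priority order.
PROVIDER_KEYS = [
    ("OPENAI_API_KEY", "openai"),
    ("ANTHROPIC_API_KEY", "anthropic"),
    ("GEMINI_API_KEY", "gemini"),
    ("MISTRAL_API_KEY", "mistral"),
    ("OPENROUTER_API_KEY", "openrouter"),
    ("GROQ_API_KEY", "groq"),
    ("OLLAMA_API_KEY", "ollama"),
]

def detect_provider(env=None):
    """Return the first provider whose API key is set, else None."""
    env = os.environ if env is None else env
    for key, provider in PROVIDER_KEYS:
        if env.get(key):
            return provider
    return None

print(detect_provider({"GROQ_API_KEY": "gsk_..."}))  # → groq
```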

Production Features

Streaming responses:

async for chunk in agent.stream("Analyze this large codebase"):
    print(chunk, end="")

Full observability:

result = agent.run("Deploy my app")
logs = agent.logs()  # See exactly what happened
print(logs)  # ["🔧 triage: selected 2 tools", "💻 shell: deploying...", ...]

Error resilience:

# Tool failures don't crash execution
agent.run("List files in /nonexistent")  # → Graceful error handling
# API timeouts auto-retry with backoff
# Memory failures don't block responses
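Cogency handles these retries internally; for readers unfamiliar with the pattern, retry-with-exponential-backoff looks roughly like this (a generic sketch, not cogency's actual implementation):

```python
import time

def retry_with_backoff(fn, attempts=3, base_delay=0.5, sleep=time.sleep):
    """Call fn(); on exception wait base_delay * 2**attempt, then retry."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the error
            sleep(base_delay * (2 ** attempt))

# Example: a call that fails twice, then succeeds on the third attempt.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("transient")
    return "ok"

print(retry_with_backoff(flaky, sleep=lambda _: None))  # → ok
```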

Advanced Usage

# Full customization when needed
agent = Agent(
    "assistant",
    memory=True,              # Persistent user context
    tools=["files", "shell"],  # Specific tools only  
    max_iterations=20,        # Deep reasoning limit
    debug=True               # Detailed execution logs
)

# Custom memory configuration
from cogency.config import MemoryConfig
agent = Agent("assistant", memory=MemoryConfig(threshold=8000))

# Custom event handlers
agent = Agent("assistant", handlers=[websocket_handler])

Documentation

  • Quick Start - Get running in 5 minutes
  • API Reference - Complete Agent class documentation
  • Tools - Built-in tools and custom tool creation
  • Examples - Detailed code examples and walkthroughs
  • Memory - Memory system documentation
  • Reasoning - Adaptive reasoning modes

License

Apache 2.0

Support

Built for developers who want agents that just work.


Download files


Source Distribution

cogency-1.2.0.tar.gz (98.7 kB)

Uploaded Source

Built Distribution


cogency-1.2.0-py3-none-any.whl (138.5 kB)

Uploaded Python 3

File details

Details for the file cogency-1.2.0.tar.gz.

File metadata

  • Download URL: cogency-1.2.0.tar.gz
  • Size: 98.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.3 CPython/3.12.10 Darwin/24.5.0

File hashes

Hashes for cogency-1.2.0.tar.gz:

  • SHA256: c11e6c3a34c32352299f6edf0ac74d126248a85b4cfe5ec6027dbad4b2f94ebf
  • MD5: a407a13419e752c62939719e152df8f2
  • BLAKE2b-256: 8e854459fdfe4a5711eb2ae90fde6b80845b3eb119ab04747ffad31e39455156


File details

Details for the file cogency-1.2.0-py3-none-any.whl.

File metadata

  • Download URL: cogency-1.2.0-py3-none-any.whl
  • Size: 138.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.3 CPython/3.12.10 Darwin/24.5.0

File hashes

Hashes for cogency-1.2.0-py3-none-any.whl:

  • SHA256: fa1854ce087b89566e3b6cf318675c3dec3c6190e61cca22621ca85f9573b5af
  • MD5: 251141bde6e94116d2446bd0a86412da
  • BLAKE2b-256: f5f2c85cee65e9aed4383f1a73d03e70b6648ae05add679dca7382bea639f809

