
Cogency


A reasoning engine for adaptive AI agents.

from cogency import Agent
agent = Agent("assistant")

# Simple task → direct response
agent.run("What's 2+2?")

# Complex task → adaptive reasoning
agent.run("Analyze this codebase and suggest architectural improvements")
# Automatically escalates reasoning depth and tool usage

Why Cogency?

Zero ceremony, maximum capability - get production-ready agents from a single import.

  • 🔒 Semantic security - Built-in safety that blocks unsafe requests automatically
  • ⚡ Adaptive reasoning - Thinks fast for simple queries, deep for complex tasks
  • 🛠️ Smart tooling - Tools auto-register and route intelligently
  • 🧠 Built-in memory - Persistent context that actually learns about users
  • 🏗️ Production ready - Resilience, tracing, and error recovery out of the box

Get Started in 30 Seconds

pip install cogency
export OPENAI_API_KEY=...

from cogency import Agent

agent = Agent("assistant")
result = agent.run("What's in the current directory?")
print(result)

That's it. No configuration, no setup, no tool registration. Just working agents.

What Makes It Different

Semantic Security

agent.run("rm -rf /")  # ❌ Blocked automatically
agent.run("List files safely")  # ✅ Proceeds normally

Adaptive Intelligence

agent.run("What's 2+2?")  # Fast: Direct response
agent.run("Analyze my codebase")  # Deep: Multi-step reasoning

Memory That Actually Works

agent = Agent("assistant", memory=True)
agent.run("I prefer Python and work at Google")
agent.run("What language should I use?")  # → "Python"

Built-in Capabilities

Tools that just work:

  • 📁 Files - Read, write, edit any file
  • 💻 Shell - Execute commands safely
  • 🌐 HTTP - API calls and requests
  • 📖 Scrape - Extract web content
  • 🔍 Search - Web search via DuckDuckGo

Plus add your own:

@tool
class DatabaseTool(Tool):
    async def run(self, query: str):
        # `db` here is your own async database client, not something Cogency provides
        return await db.execute(query)

# Automatically available to all agents
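The auto-registration pattern can be sketched as a decorator-backed registry. This is illustrative only, not Cogency's actual internals: `TOOL_REGISTRY`, this minimal `Tool` base class, and `EchoTool` are all assumptions for the sake of the example.

```python
import asyncio

# Hypothetical global registry; Cogency's real mechanism may differ.
TOOL_REGISTRY = []

class Tool:
    """Minimal stand-in for a tool base class."""
    name = "tool"

def tool(cls):
    """Decorator that records a Tool subclass so agents can discover it."""
    TOOL_REGISTRY.append(cls)
    return cls

@tool
class EchoTool(Tool):
    name = "echo"

    async def run(self, query: str):
        return f"echo: {query}"

async def main():
    # An agent can look up registered tools by name and invoke them.
    tool_cls = next(t for t in TOOL_REGISTRY if t.name == "echo")
    print(await tool_cls().run("hello"))  # prints "echo: hello"

asyncio.run(main())
```

The key design point is that decoration happens at import time, so simply defining the class is enough to make it visible to every agent.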

Universal LLM Support

Works with any LLM - just set the API key:

# OpenAI
export OPENAI_API_KEY=sk-...

# Anthropic  
export ANTHROPIC_API_KEY=sk-ant-...

# Google Gemini
export GEMINI_API_KEY=...

# Mistral
export MISTRAL_API_KEY=...

# OpenRouter (cost-effective)
export OPENROUTER_API_KEY=sk-or-v1-...

# Groq (high-performance) 
export GROQ_API_KEY=gsk_...

# Ollama (local models)
export OLLAMA_API_KEY=...

No configuration needed - Cogency detects the available provider from your environment and configures it automatically.
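Detection along these lines can be sketched as a simple environment scan. This is a minimal illustration under assumptions, not Cogency's actual logic: the provider ordering and the `detect_provider` helper are invented for the example.

```python
import os

# Hypothetical priority order; Cogency's real detection may differ.
PROVIDER_ENV_VARS = [
    ("openai", "OPENAI_API_KEY"),
    ("anthropic", "ANTHROPIC_API_KEY"),
    ("gemini", "GEMINI_API_KEY"),
    ("mistral", "MISTRAL_API_KEY"),
    ("openrouter", "OPENROUTER_API_KEY"),
    ("groq", "GROQ_API_KEY"),
    ("ollama", "OLLAMA_API_KEY"),
]

def detect_provider(env=None):
    """Return the first provider whose API key is set, or None."""
    env = os.environ if env is None else env
    for provider, var in PROVIDER_ENV_VARS:
        if env.get(var):
            return provider
    return None

print(detect_provider({"GROQ_API_KEY": "gsk_test"}))  # prints "groq"
```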

Production Features

Streaming responses:

async for chunk in agent.stream("Analyze this large codebase"):
    print(chunk, end="")

Full observability:

result = agent.run("Deploy my app")
logs = agent.logs()  # See exactly what happened
print(logs)  # ["🔧 triage: selected 2 tools", "💻 shell: deploying...", ...]

Error resilience:

# Tool failures don't crash execution
agent.run("List files in /nonexistent")  # → Graceful error handling
# API timeouts auto-retry with backoff
# Memory failures don't block responses
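The auto-retry behavior can be illustrated with a generic exponential-backoff wrapper. This is a sketch of the pattern, not Cogency's implementation; `with_retries` and its parameters are assumptions.

```python
import asyncio
import random

async def with_retries(fn, *, attempts=3, base_delay=0.5):
    """Call an async fn, retrying failures with exponential backoff plus jitter."""
    for attempt in range(attempts):
        try:
            return await fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the error to the caller
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.1)
            await asyncio.sleep(delay)

# Example: a flaky call that succeeds on the third try.
calls = {"n": 0}

async def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("simulated API timeout")
    return "ok"

print(asyncio.run(with_retries(flaky)))  # prints "ok"
```

Backoff with jitter keeps retries from hammering an already-struggling API, which is why it is the standard choice for transient network failures.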

Advanced Usage

# Full customization when needed
agent = Agent(
    "assistant",
    memory=True,              # Persistent user context
    tools=["files", "shell"],  # Specific tools only  
    max_iterations=20,        # Deep reasoning limit
    debug=True               # Detailed execution logs
)

# Custom memory configuration
from cogency.config import MemoryConfig
agent = Agent("assistant", memory=MemoryConfig(threshold=8000))

# Custom event handlers
agent = Agent("assistant", handlers=[websocket_handler])

Documentation

  • Quick Start - Get running in 5 minutes
  • API Reference - Complete Agent class documentation
  • Tools - Built-in tools and custom tool creation
  • Examples - Detailed code examples and walkthroughs
  • Memory - Memory system documentation
  • Reasoning - Adaptive reasoning modes

License

Apache 2.0

Support

Built for developers who want agents that just work.
