Agentic AI out of the box

Cogency

3-line AI agents that just work

Cogency is a multi-step reasoning framework that makes building AI agents stupidly simple. It auto-detects providers, intelligently routes tools, and streams transparent reasoning.

Installation

pip install cogency

Quick Start

import asyncio
from cogency import Agent

async def main():
    # That's it. Auto-detects LLM from .env
    agent = Agent("assistant")
    
    # Beautiful streaming ReAct reasoning (NEW in 0.4.1!)
    await agent.run_streaming("What is 25 * 43?")
    # Shows: 🧠 REASON → ⚡ ACT → 👀 OBSERVE → 💬 RESPOND

asyncio.run(main())

Requirements: Python 3.9+, API key in .env

Core Philosophy

  • Zero ceremony - 3 lines to working agent
  • Magical auto-detection - Detects OpenAI, Anthropic, Gemini, Grok, Mistral from environment
  • Intelligent tool routing - LLM filters relevant tools, keeps prompts lean
  • Stream-first - Watch agents think in real-time
  • Plug-and-play - Drop in new tools, they just work
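Provider auto-detection of this kind typically reduces to checking well-known environment variables in priority order. A minimal stdlib-only sketch of the idea (the env-var names and ordering here are illustrative assumptions, not Cogency's actual internals):

```python
import os

# Illustrative env-var → provider mapping; checked in insertion order.
PROVIDER_ENV_VARS = {
    "OPENAI_API_KEY": "openai",
    "ANTHROPIC_API_KEY": "anthropic",
    "GEMINI_API_KEY": "gemini",
    "XAI_API_KEY": "grok",
    "MISTRAL_API_KEY": "mistral",
}

def detect_provider(env=os.environ):
    """Return the first provider whose API key is present, else None."""
    for var, provider in PROVIDER_ENV_VARS.items():
        if env.get(var):
            return provider
    return None
```

With this approach, dropping a key into `.env` is the only configuration step; the first matching key wins.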

Examples

Hello World

import asyncio
from cogency import Agent

async def main():
    agent = Agent("assistant")
    result = await agent.run("What is 25 * 43?", mode="summary")
    print(result)  # 1075

asyncio.run(main())

With Tools

import asyncio
from cogency import Agent, WeatherTool

async def main():
    agent = Agent("weather_assistant", tools=[WeatherTool()])
    result = await agent.run("What's the weather in San Francisco?", mode="summary")
    print(result)

asyncio.run(main())

Streaming Reasoning

import asyncio
from cogency import Agent, CalculatorTool, WebSearchTool

async def main():
    agent = Agent("analyst", tools=[CalculatorTool(), WebSearchTool()])

    # Stream with trace mode for full visibility
    async for chunk in agent.stream("Find Bitcoin price and calculate value of 0.5 BTC", mode="trace"):
        print(chunk, end="", flush=True)

asyncio.run(main())

Custom Tools

from cogency import Agent, BaseTool

class TimezoneTool(BaseTool):
    def __init__(self):
        super().__init__("timezone", "Get time in any city")
    
    async def run(self, city: str):
        return {"time": f"Current time in {city}: 14:30 PST"}
    
    def get_schema(self):
        return "timezone(city='string')"

agent = Agent("time_assistant", tools=[TimezoneTool()])
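The `get_schema()` string is presumably what the LLM sees when choosing a tool to call; assembling a set of schemas into a prompt fragment can be sketched like this (a toy illustration, not Cogency's actual prompt format):

```python
def build_tool_prompt(tools):
    """Render (schema, description) pairs as a prompt fragment for the LLM."""
    lines = ["Available tools:"]
    for schema, description in tools:
        lines.append(f"- {schema}: {description}")
    return "\n".join(lines)

prompt = build_tool_prompt([
    ("timezone(city='string')", "Get time in any city"),
    ("calculator(expression='string')", "Math operations"),
])
```

Keeping the per-tool line this terse is what lets "tool subsetting" keep the overall prompt lean.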

Configuration

# Set your API keys
echo "OPENAI_API_KEY=sk-..." >> .env
echo "ANTHROPIC_API_KEY=sk-ant-..." >> .env
echo "GEMINI_API_KEY=your-key-here" >> .env
# Agent auto-detects from any available key
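Those `KEY=VALUE` lines are all a `.env` file is; if you are not using a loader such as python-dotenv, a minimal stdlib-only loader looks roughly like this (a sketch that ignores quoting and `export` prefixes):

```python
import os

def load_env(path=".env"):
    """Parse KEY=VALUE lines into os.environ, skipping comments and blanks."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # setdefault: real environment variables win over .env entries
            os.environ.setdefault(key.strip(), value.strip())
```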

Supported Providers

LLMs (Auto-detected):

  • OpenAI (GPT-4, GPT-3.5)
  • Anthropic (Claude)
  • Google (Gemini)
  • xAI (Grok)
  • Mistral

Embeddings (Auto-detected):

  • OpenAI (text-embedding-3)
  • Nomic (nomic-embed-text)
  • Sentence Transformers (local)

Built-in Tools:

  • Calculator - Math operations
  • Weather - Real weather data (no API key)
  • Timezone - World time (no API key)
  • WebSearch - Internet search
  • FileManager - File operations

PRARR Architecture

Cogency uses Plan-Reason-Act-Reflect-Respond for transparent multi-step reasoning:

📋 PLAN    → Analyze request, filter relevant tools
🧠 REASON  → Determine tool usage strategy  
⚡ ACT     → Execute tools, gather results
🔍 REFLECT → Filter out noise, focus results
💬 RESPOND → Generate clean final answer

Every step is streamable and traceable.
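The five stages can be read as a straightforward loop over an agent state; a toy rendition of the control flow (function names and state shape are illustrative, not Cogency's API):

```python
def prarr(query, plan, reason, act, reflect, respond):
    """Run one PLAN → REASON → ACT → REFLECT → RESPOND pass over a query.

    Each stage is an injected function taking and returning the state dict,
    which is what makes every step individually streamable and traceable.
    """
    state = {"query": query, "trace": []}
    for name, stage in [("PLAN", plan), ("REASON", reason),
                        ("ACT", act), ("REFLECT", reflect),
                        ("RESPOND", respond)]:
        state = stage(state)
        state["trace"].append(name)  # record the stage for the trace view
    return state
```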

Output Examples

Output Modes

Cogency supports three output modes:

Summary Mode (Default): Clean final answer only

result = await agent.run("What's 15 * 23?", mode="summary")
print(result)  # "345"

Trace Mode: Beautiful execution trace + answer

result = await agent.run("What's 15 * 23?", mode="trace")
# Outputs:
# 🚀 EXECUTION TRACE (450ms total)
# ==================================================
# 🔸 PLAN [14:30:15] 120ms
#    📥 'What's 15 * 23?'
#    📤 Decision: tool_needed
#
# 🔸 ACT [14:30:15] 200ms
#    📥 calculator(expression="15 * 23")
#    📤 Result: 345
# ==================================================
# ✅ Final: 345

Dev Mode: Raw state dumps for debugging

result = await agent.run("What's 15 * 23?", mode="dev")
# Outputs full AgentState with all internal data

Streaming: Real-time output with any mode

async for chunk in agent.stream("Calculate something", mode="trace"):
    print(chunk, end="", flush=True)
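The three modes can be viewed as different renderers over the same execution record; a toy dispatcher makes the distinction concrete (the record shape here is invented for illustration):

```python
def render(record, mode="summary"):
    """Render an execution record as summary, trace, or dev output."""
    if mode == "summary":
        return record["answer"]  # clean final answer only
    if mode == "trace":
        steps = "\n".join(f"{s['phase']}: {s['detail']}" for s in record["steps"])
        return f"{steps}\nFinal: {record['answer']}"
    if mode == "dev":
        return repr(record)  # raw state dump for debugging
    raise ValueError(f"unknown mode: {mode}")
```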

Key Features

  • Auto-detection - Zero config provider setup
  • Tool subsetting - Intelligent filtering keeps prompts lean
  • Key rotation - Load balance across multiple API keys
  • Result filtering - Remove execution metadata in REFLECT
  • Stream transparency - Watch reasoning in real-time
  • Beautiful traces - Human-readable execution logs
  • Plug-and-play - Drop in tools, they auto-register
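Of these, key rotation is the easiest to picture: cycle through the available keys per request so no single key absorbs all the rate-limit pressure. A minimal round-robin sketch (Cogency's actual strategy may differ):

```python
from itertools import cycle

class KeyRotator:
    """Round-robin over a pool of API keys."""

    def __init__(self, keys):
        if not keys:
            raise ValueError("at least one API key required")
        self._pool = cycle(keys)  # endless iterator over the key list

    def next_key(self):
        """Return the next key in rotation."""
        return next(self._pool)
```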

Contributing

Framework designed for extension:

# Add new LLM provider
class YourLLM(ProviderMixin, BaseLLM):
    async def invoke(self, messages, **kwargs):
        ...  # Your implementation: call the provider, return the completion

# Add new tool
class YourTool(BaseTool):
    async def run(self, **params):
        ...  # Your implementation: execute and return a result dict

That's it. Auto-discovery handles the rest.
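One common way to implement that kind of auto-discovery is subclass introspection: every class that inherits the base is visible via `__subclasses__()`, so merely defining a tool registers it. A self-contained sketch of the pattern (not necessarily Cogency's mechanism):

```python
class ToolBase:
    """Stand-in for a framework base class like BaseTool."""

def discover(base):
    """Collect every currently defined direct subclass of `base` by name."""
    return {cls.__name__: cls for cls in base.__subclasses__()}

# Defining the subclass is the registration step; no explicit call needed.
class EchoTool(ToolBase):
    pass

registry = discover(ToolBase)
```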

License

MIT - Build whatever you want.


Cogency: AI agents without the ceremony.
