
Cogency: Stateless Context-Driven Agent Framework

Context injection + LLM inference = complete reasoning engine.

After extensive research (340 commits), we discovered agents work better as functions.

Quick Start

import asyncio
from cogency import Agent

async def main():
    agent = Agent()
    response = await agent("What are the benefits of async/await in Python?")
    print(response)

# Run with: python your_script.py
asyncio.run(main())

Installation

pip install cogency

Set your OpenAI API key:

export OPENAI_API_KEY="your-api-key-here"

Examples

Basic Agent

from cogency import Agent

agent = Agent()
# await must run inside an async function (e.g. under asyncio.run)
response = await agent("Explain quantum computing in simple terms")

ReAct Agent with Tools

from cogency import ReAct

agent = ReAct(verbose=True)
result = await agent.solve("Create a Python script that calculates factorial of 10")
print(result["final_answer"])

User-Specific Context

from cogency import Agent, profile

# Set user preferences (optional)
profile("alice", 
        name="Alice Johnson",
        preferences=["Python", "Machine Learning"],
        context="Senior data scientist working on NLP projects")

agent = Agent()
response = await agent("Recommend a good ML library for text processing", user_id="alice")

Custom Knowledge Base

from cogency.storage import add_document

# Add documents to knowledge base (optional)
add_document("python_guide", "Python is a high-level programming language...")
add_document("ml_basics", "Machine learning is a subset of artificial intelligence...")

# Agent automatically searches relevant documents for context
agent = Agent()
response = await agent("What's the difference between Python and machine learning?")

Architecture

Context-driven agents work by injecting relevant information before each query:

async def agent_call(query: str, user_id: str = "default") -> str:
    ctx = context(query, user_id)  # Assembles relevant context
    prompt = f"{ctx}\n\nQuery: {query}"
    return await llm.generate(prompt)

Context sources include:

  • System: Base instructions
  • Conversation: Recent message history
  • Knowledge: Semantic search results
  • Memory: User profile and preferences
  • Working: Tool execution history (for ReAct agents)
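The sources above can be composed as a single pure function, in line with the design principles below. This is a sketch only: the `fetch_*` helper names, the stub return values, and the `##` section layout are assumptions for illustration, not cogency's actual internals.

```python
# Hypothetical sketch of context assembly; fetch_* helpers are stand-ins.

def fetch_history(user_id: str) -> str:
    return ""  # read-only source: returns empty on failure or no data

def search_documents(query: str) -> str:
    return "Python is a high-level programming language."  # toy search result

def fetch_profile(user_id: str) -> str:
    return ""

def assemble_context(query: str, user_id: str = "default") -> str:
    sections = [
        ("System", "You are a helpful assistant."),
        ("Conversation", fetch_history(user_id)),
        ("Knowledge", search_documents(query)),
        ("Memory", fetch_profile(user_id)),
    ]
    # Empty sources are skipped, so a failed lookup degrades gracefully
    return "\n\n".join(f"## {name}\n{text}" for name, text in sections if text)
```

Because each source is read-only and the assembly is deterministic, the function can be unit-tested with plain string assertions.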

Design Principles

  • Zero writes during reasoning - no database operations in the hot path
  • Pure functions for context assembly - deterministic and testable
  • Read-only context sources - graceful degradation on failures
  • Optional persistence - conversation history saved asynchronously
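One way to keep writes out of the hot path is a fire-and-forget background task: the response is returned immediately and the history write happens afterwards. This is a minimal sketch under that assumption; `HISTORY`, `save_history`, and `answer` are illustrative names, not cogency's API.

```python
import asyncio

# In-memory stand-in for a conversation store
HISTORY: dict[str, list] = {}

async def save_history(user_id: str, query: str, response: str) -> None:
    # Stand-in for an async database write
    HISTORY.setdefault(user_id, []).append((query, response))

async def answer(query: str, user_id: str = "default") -> str:
    response = f"echo: {query}"  # stand-in for context assembly + inference
    # Persist off the hot path: the caller gets the response immediately,
    # while the write completes in the background
    asyncio.create_task(save_history(user_id, query, response))
    return response
```

In production code you would hold a reference to the task (or use a task group) so the write cannot be garbage-collected or silently dropped before it runs.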

API Reference

Agent

Simple conversational agent with context injection.

agent = Agent()
response = await agent(query: str, user_id: str = "default") -> str

ReAct

Tool-using agent with Reason + Act loops.

agent = ReAct(tools=None, user_id="default", verbose=False)
result = await agent.solve(task: str, max_iterations: int = 5) -> dict
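The Reason + Act loop behind `solve` can be sketched as follows. The scripted `reason()` policy and the toy calculator tool are stand-ins for illustration: cogency's ReAct agent asks the LLM to choose the next action, which is why the loop is capped by `max_iterations`.

```python
# Illustrative Reason + Act loop; not the cogency implementation.

TOOLS = {
    # Toy calculator tool: evaluates an arithmetic expression
    "calculator": lambda expr: str(eval(expr, {"__builtins__": {}})),
}

def reason(task: str, working: list) -> tuple[str, str, str]:
    # In a real agent this is an LLM call conditioned on the task and the
    # working context (tool execution history). Here it is scripted.
    if not working:
        return ("Compute the factorial directly", "calculator",
                "10*9*8*7*6*5*4*3*2*1")
    return ("The observation answers the task", "finish", working[-1][-1])

def solve(task: str, max_iterations: int = 5) -> dict:
    working = []  # tool execution history, fed back as context each step
    for _ in range(max_iterations):
        thought, action, arg = reason(task, working)
        if action == "finish":
            return {"final_answer": arg, "steps": working}
        observation = TOOLS[action](arg)
        working.append((thought, action, arg, observation))
    return {"final_answer": None, "steps": working}
```

Each iteration appends its observation to the working context, so later reasoning steps can build on earlier tool results.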

Context Functions

from cogency import profile
from cogency.storage import add_document

# User profiles
profile(user_id, name=None, preferences=None, context=None)

# Knowledge base
add_document(doc_id: str, content: str, metadata: dict = None)

Testing

# Install dev dependencies
poetry install

# Run tests
pytest tests/

Documentation

See docs/blueprint.md for the complete technical specification.

v2.0.0 represents a complete architectural rewrite based on empirical evidence that simpler approaches work better for LLM-based reasoning systems.
