# Cogency

Streaming agents with stateless context assembly.
## Install

```shell
pip install cogency
export OPENAI_API_KEY="your-key"
```
## Quickstart

```python
from cogency import Agent

agent = Agent(llm="openai")

async for event in agent("What files are in this directory?"):
    if event["type"] == "respond":
        print(event["content"])
```
## Core Design

- **Persist-then-rebuild**: events are written to storage immediately; context is rebuilt from storage on each execution
- **Protocol/storage separation**: XML delimiters for LLM I/O, clean structured events in storage
- **Stateless execution**: agent and context are pure functions; all state lives in storage

Result: no state corruption, crash recovery, and concurrent safety.
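A minimal sketch of the persist-then-rebuild pattern, using a plain list as a stand-in for durable storage (the names `persist` and `rebuild_context` are illustrative, not Cogency's API):

```python
# Illustrative sketch, not Cogency internals: every event is written to
# storage first, and the working context is always derived from storage,
# never mutated in place.

events: list[dict] = []  # stand-in for durable storage

def persist(event: dict) -> None:
    """Write the event to storage before anything else reads it."""
    events.append(event)

def rebuild_context() -> list[dict]:
    """Derive the LLM context purely from stored events."""
    return [e for e in events if e["type"] in ("user", "respond")]

persist({"type": "user", "content": "hi"})
persist({"type": "think", "content": "greeting"})
persist({"type": "respond", "content": "hello"})

context = rebuild_context()
# Because context is a pure function of storage, a crash between turns
# loses nothing: re-running rebuild_context() restores the same state.
```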
## Execution Modes

| Mode | Method | Token Usage | Providers |
|---|---|---|---|
| Resume | WebSocket | Constant | OpenAI, Gemini |
| Replay | HTTP | Grows with conversation | All |
| Auto | WebSocket with HTTP fallback | Optimal | All |

```python
agent = Agent(llm="openai", mode="auto")  # Default
```
**Token efficiency** (Resume vs Replay):
| Turns | Replay | Resume | Savings |
|---|---|---|---|
| 16 | 100,800 | 10,800 | 9.3x |
| 32 | 355,200 | 20,400 | 17.4x |
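The savings column is simply the ratio of the two totals in the table above:

```python
# Savings ratio = replay tokens / resume tokens, per the table above.
replay = {16: 100_800, 32: 355_200}
resume = {16: 10_800, 32: 20_400}

savings = {turns: replay[turns] / resume[turns] for turns in replay}
# 16 turns: 100800 / 10800 ≈ 9.3x; 32 turns: 355200 / 20400 ≈ 17.4x
```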
## Streaming

**Event mode** (default): complete semantic units

```python
async for event in agent("Debug this code", stream="event"):
    if event["type"] == "think":
        print(f"~ {event['content']}")
    elif event["type"] == "respond":
        print(f"> {event['content']}")
```

**Token mode**: real-time token-level streaming

```python
async for event in agent("Debug this code", stream="token"):
    if event["type"] == "respond":
        print(event["content"], end="", flush=True)
```
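Since both modes yield plain event dicts, the stream composes with ordinary async helpers. A sketch that accumulates the full response text, with a stub async generator (`fake_agent`) standing in for a real agent call:

```python
import asyncio

# fake_agent is a stub standing in for an agent call; it yields events
# in the same {"type": ..., "content": ...} shape shown above.
async def fake_agent():
    yield {"type": "think", "content": "inspecting"}
    yield {"type": "respond", "content": "Hello, "}
    yield {"type": "respond", "content": "world"}

async def collect_response(stream) -> str:
    """Concatenate the content of all respond events in the stream."""
    parts = []
    async for event in stream:
        if event["type"] == "respond":
            parts.append(event["content"])
    return "".join(parts)

text = asyncio.run(collect_response(fake_agent()))
# text == "Hello, world"
```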
## Conversations

**Stateless** (default):

```python
async for event in agent("What's in this directory?"):
    if event["type"] == "respond":
        print(event["content"])
```

**Stateful** with profile learning:

```python
async for event in agent(
    "Continue our code review",
    conversation_id="review_session",
    user_id="developer",  # For profile learning and multi-tenancy
):
    if event["type"] == "respond":
        print(event["content"])
```
## Built-in Tools

| Tool | Description |
|---|---|
| read | Read file (with optional pagination) |
| write | Write file (overwrite protection) |
| edit | Replace exact text in file |
| list | Tree view of directory |
| find | Find files by pattern or content |
| replace | Find-and-replace across files |
| shell | Execute shell command |
| search | Web search |
| scrape | Extract webpage text |
| recall | Search past conversations |
## Custom Tools

```python
from dataclasses import dataclass
from typing import Annotated

from cogency import ToolResult
from cogency.core.tool import tool
from cogency.core.protocols import ToolParam

@dataclass
class QueryParams:
    sql: Annotated[str, ToolParam(description="SQL query")]

@tool("Execute SQL queries")
async def query_db(params: QueryParams, **kwargs) -> ToolResult:
    result = db.execute(params.sql)  # `db` is your own database handle
    return ToolResult(outcome="Query executed", content=result)

agent = Agent(llm="openai", tools=[query_db])
```
## Configuration

```python
agent = Agent(
    llm="openai",                 # or "gemini", "anthropic"
    mode="auto",                  # "resume", "replay", or "auto"
    storage=custom_storage,       # Custom Storage implementation
    identity="Custom agent identity",
    instructions="Additional context",
    tools=[CustomTool()],
    max_iterations=10,
    history_window=None,          # None = full history, int = sliding window
    history_transform=compress,   # Optional history compression callable
    profile=True,                 # Enable automatic user learning
    security=Security(access="project", shell_timeout=60),  # Security policies
    notifications=notification_source,  # Mid-execution context injection
    debug=False,
)
```
**History compression**: for long conversations, pass `history_transform` to compress context:

```python
async def compress(messages: list[dict]) -> list[dict]:
    if len(messages) <= 20:
        return messages
    summary = {"role": "system", "content": f"[{len(messages) - 10} earlier messages]"}
    return [summary] + messages[-10:]

agent = Agent(llm="openai", history_transform=compress)
```
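The transform above leaves short histories untouched and folds older messages into a single summary line; a quick check on synthetic message lists:

```python
import asyncio

# The same compress transform as above, exercised on synthetic histories.
async def compress(messages: list[dict]) -> list[dict]:
    if len(messages) <= 20:
        return messages
    summary = {"role": "system", "content": f"[{len(messages) - 10} earlier messages]"}
    return [summary] + messages[-10:]

short = [{"role": "user", "content": str(i)} for i in range(5)]
long_history = [{"role": "user", "content": str(i)} for i in range(50)]

assert asyncio.run(compress(short)) == short  # under the threshold: untouched
compressed = asyncio.run(compress(long_history))
# 1 summary message + the 10 most recent messages survive.
```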
## Documentation
- architecture.md - Core pipeline and design decisions
- execution.md - Tool execution protocol specification
- protocol.md - Wire format, event stream, storage
- tools.md - Built-in tool reference
- memory.md - Profile, recall, history window
- proof.md - Mathematical efficiency analysis
## License

Apache 2.0