# acorn

LLM agent framework with structured I/O: build AI agents with type-safe inputs and outputs, automatic tool calling, and powerful agentic loops.
## ✨ Features
- 🎯 Structured I/O - Pydantic models for inputs and outputs
- 🤖 Agentic Loops - Multi-turn execution with tool calling
- 🛠️ Auto Tool Schemas - Generate from type hints and docstrings
- 🔄 Dynamic Tools - Add/remove tools during execution
- ✅ Parse Error Recovery - Automatic retry on validation failures
- 📊 Step Callbacks - Full control over loop behavior
- 🔌 LiteLLM Integration - Works with any LLM provider
- 🌊 Streaming Responses - Real-time output with partial structured updates
- 💾 Provider Caching - Reduce latency and cost with prompt caching
- 🛡️ Model Fallbacks - Automatic provider failover for high availability
- 🌳 Branching Workflows - Spawn sub-agents that extend parent capabilities for parallel analysis and map-reduce patterns
## 🚀 Quick Start

### Installation

```bash
pip install acorn
```

Set your API key:

```bash
# For Anthropic Claude
export ANTHROPIC_API_KEY="your-key-here"

# Or for OpenAI
export OPENAI_API_KEY="your-key-here"

# Or any other LiteLLM-supported provider
```
### Single-Turn Example

```python
from pydantic import BaseModel, Field
from acorn import Module

class Input(BaseModel):
    text: str = Field(description="The text to summarize")
    max_words: int = Field(default=100, description="Maximum words in summary")

class Output(BaseModel):
    summary: str = Field(description="The concise summary")
    word_count: int = Field(description="Number of words in summary")

class Summarizer(Module):
    """Summarize text concisely."""
    initial_input = Input
    final_output = Output
    model = "anthropic/claude-sonnet-4-5-20250514"

# Use it
summarizer = Summarizer()
result = summarizer(
    text="Long article text here...",
    max_words=50,
)

print(result.summary)
print(f"Words: {result.word_count}")
```
### Multi-Turn Agentic Loop

```python
from pydantic import BaseModel, Field
from acorn import Module, tool

class Input(BaseModel):
    topic: str = Field(description="Research topic")
    depth: str = Field(default="shallow", description="Research depth")

class Output(BaseModel):
    findings: str = Field(description="Summary of findings")
    sources: list[str] = Field(description="Sources consulted")

class ResearchAgent(Module):
    """Research assistant with tools."""
    initial_input = Input
    max_steps = 5  # Enable agentic loop
    final_output = Output
    model = "anthropic/claude-sonnet-4-5-20250514"

    @tool
    def search(self, query: str) -> list:
        """Search for information."""
        # Your search implementation
        return ["result1", "result2"]

    @tool
    def analyze(self, data: str) -> str:
        """Analyze collected data."""
        # Your analysis implementation
        return f"Analysis: {data}"

    def on_step(self, step):
        """Called after each step."""
        print(f"Step {step.counter}")
        # Early termination if done
        if len(step.tool_results) >= 3:
            step.finish(
                findings="Sufficient data collected",
                sources=["source1", "source2"],
            )
        return step

# Use it
agent = ResearchAgent()
result = agent(topic="Large Language Models", depth="shallow")
```
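Conceptually, an agentic loop like the one above repeats a model call, executes any requested tools, and feeds the results back until the model produces the final output or `max_steps` is reached. A framework-free sketch of that control flow (all names here are illustrative, not acorn's internals):

```python
def run_agent(call_model, tools, max_steps):
    """Minimal agentic loop: call the model, run requested tools, feed results back."""
    messages = []
    for step in range(1, max_steps + 1):
        reply = call_model(messages)  # returns {"tool_calls": [...]} or {"final": ...}
        if "final" in reply:
            return reply["final"]     # model produced the structured output
        results = [tools[c["name"]](**c["args"]) for c in reply["tool_calls"]]
        messages.append({"step": step, "results": results})
    return None                       # forced termination after max_steps

# Toy stand-in for an LLM: asks for one search, then finishes
def fake_model(messages):
    if not messages:
        return {"tool_calls": [{"name": "search", "args": {"query": "LLMs"}}]}
    return {"final": {"findings": "done", "sources": ["result1"]}}

out = run_agent(fake_model, {"search": lambda query: [f"hit for {query}"]}, max_steps=5)
```

In acorn, the `on_step` callback hooks into this cycle between the tool execution and the next model call.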
## 📖 Documentation
- Getting Started - Installation and first steps
- Module Reference - Complete Module API documentation
- Branching - Sub-agents and parallel processing
## 📚 Core Concepts

### Module

Base class for LLM agents. Configure with:

- `model` - LLM to use (required - no default)
- `temperature` - Sampling temperature
- `max_tokens` - Maximum tokens to generate
- `max_steps` - Max agentic loop iterations (None = single-turn)
- `initial_input` - Pydantic model for input schema
- `final_output` - Pydantic model for output schema
- `tools` - List of available tools
- `cache` - Enable provider-level prompt caching
- `model_fallbacks` - List of fallback models for automatic failover
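The `model_fallbacks` option tries backup models when the primary provider fails. A generic sketch of that failover pattern (not acorn's implementation; `call_with_fallbacks` and `flaky_invoke` are illustrative names):

```python
def call_with_fallbacks(models, invoke):
    """Try each model in order; return the first successful response."""
    last_err = None
    for model in models:
        try:
            return invoke(model)
        except Exception as err:  # a real implementation would catch provider errors only
            last_err = err
    raise RuntimeError(f"all models failed: {last_err}")

# Simulated provider call where the primary model is down
def flaky_invoke(model):
    if model == "primary/model":
        raise TimeoutError("provider down")
    return f"answer from {model}"

result = call_with_fallbacks(["primary/model", "backup/model"], flaky_invoke)
```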
### Tools

Functions the LLM can call:

```python
@tool
def search(query: str, limit: int = 10) -> list:
    """Search for information.

    Args:
        query: The search query
        limit: Maximum results to return
    """
    return search_api(query, limit)
```
Schema is automatically generated from type hints and docstring.
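The exact generation logic is internal to acorn, but a minimal sketch of deriving a tool schema from type hints and a docstring (the `build_schema` helper is hypothetical) could look like:

```python
import inspect
from typing import get_type_hints

# Map a few Python types to JSON-schema type names
TYPE_MAP = {str: "string", int: "integer", float: "number", bool: "boolean", list: "array"}

def build_schema(fn):
    """Derive a minimal tool schema from a function's signature and docstring."""
    hints = get_type_hints(fn)
    sig = inspect.signature(fn)
    props, required = {}, []
    for name, param in sig.parameters.items():
        props[name] = {"type": TYPE_MAP.get(hints.get(name), "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # no default value => required parameter
    return {
        "name": fn.__name__,
        "description": (inspect.getdoc(fn) or "").split("\n")[0],
        "parameters": {"type": "object", "properties": props, "required": required},
    }

def search(query: str, limit: int = 10) -> list:
    """Search for information."""
    return []

schema = build_schema(search)
# "query" is required; "limit" is optional because it has a default
```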
### Step Callback

Control agentic loop execution:

```python
def on_step(self, step):
    # Access step info
    print(f"Step {step.counter}")
    print(f"Tools called: {[tc.name for tc in step.tool_calls]}")

    # Dynamic tool management
    step.add_tool(new_tool)
    step.remove_tool("old_tool")

    # Early termination
    if condition:
        step.finish(result="Early exit")

    return step
```
## 🎯 Examples

Try them live on the Gradio app or browse the source in `examples/`:
| Example | Category | Description |
|---|---|---|
| Simple Q&A | Basic | Single-turn question answering with structured output |
| HN Production Readiness | Agentic | Checks if a trending HN project is production-ready |
| Documentation Coverage | Agentic | Scores documentation coverage of a GitHub repo (0–100) |
| Bus Factor Calculator | Branching | Calculates the bus factor of a GitHub repository |
| License Compatibility | Agentic | Checks dependency license compatibility for conflicts |
| Dependency Bloat Scanner | Branching | Finds redundant and overlapping libraries in your deps |
## 🧪 Testing

```bash
# Run all tests
pytest

# With coverage
pytest --cov=acorn

# Specific test file
pytest tests/test_agentic_loop.py -v
```
Current status: 201 tests passing, 85% coverage
## 🛣️ Roadmap

### ✅ Completed
- Single-turn execution
- Multi-turn agentic loops
- Tool calling with auto-schema generation
- Parse error recovery
- Dynamic tool management
- Step callbacks
- Streaming responses with partial structured output
- Forced termination strategies
- Provider caching
- Model fallbacks
- Branching workflows
### 📋 Planned
- Async support
- More docs
- Integration examples with different providers (vector DBs, observability tools, etc.)
## 🤝 Contributing
Contributions welcome! Please:
- Check open issues for areas to help
- Write tests for new features (maintain >80% coverage)
- Update documentation
- Add examples for new features
## 🙏 Acknowledgments
Thanks to @rosenbrockc for donating the acorn pip package name.
## 📄 License