acorn

LLM agent framework with structured I/O

Build AI agents with type-safe inputs and outputs, automatic tool calling, and powerful agentic loops.



✨ Features

  • 🎯 Structured I/O - Pydantic models for inputs and outputs
  • 🤖 Agentic Loops - Multi-turn execution with tool calling
  • 🛠️ Auto Tool Schemas - Generate from type hints and docstrings
  • 🔄 Dynamic Tools - Add/remove tools during execution
  • 🔁 Parse Error Recovery - Automatic retry on validation failures
  • 📊 Step Callbacks - Full control over loop behavior
  • 🔌 LiteLLM Integration - Works with any LLM provider
  • 🌊 Streaming Responses - Real-time output with partial structured updates
  • 💾 Provider Caching - Reduce latency and cost with prompt caching
  • 🛡️ Model Fallbacks - Automatic provider failover for high availability
  • 🌳 Branching Workflows - Spawn sub-agents that extend parent capabilities for parallel analysis and map-reduce patterns
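The Parse Error Recovery feature above can be pictured as a retry loop over candidate model responses: if a response fails validation, the error is fed back and the model is asked again. A minimal stdlib-only sketch of the pattern (an illustration, not acorn's internals; `parse_with_retry` and the `summary` field check are hypothetical):

```python
import json

def parse_with_retry(raw_responses, max_retries=2):
    """Try to parse each candidate response; on failure, move to the
    next one (standing in for re-prompting the model) until a response
    validates or retries are exhausted."""
    last_error = None
    for raw in raw_responses[: max_retries + 1]:
        try:
            data = json.loads(raw)
            if "summary" not in data:  # stand-in for schema validation
                raise ValueError("missing required field: summary")
            return data
        except (json.JSONDecodeError, ValueError) as exc:
            # In acorn, the validation error would be sent back to the LLM
            last_error = exc
    raise RuntimeError(f"validation failed after retries: {last_error}")

# First response is truncated JSON, second validates
result = parse_with_retry(['{"summary": ', '{"summary": "ok"}'])
print(result["summary"])  # → ok
```

In acorn, this loop happens transparently: the Pydantic validation error is appended to the conversation and the model gets another attempt.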

🚀 Quick Start

Installation

pip install acorn

Set your API key:

# For Anthropic Claude
export ANTHROPIC_API_KEY="your-key-here"

# Or for OpenAI
export OPENAI_API_KEY="your-key-here"

# Or any other LiteLLM-supported provider

Single-Turn Example

from pydantic import BaseModel, Field
from acorn import Module

class Input(BaseModel):
    text: str = Field(description="The text to summarize")
    max_words: int = Field(default=100, description="Maximum words in summary")

class Output(BaseModel):
    summary: str = Field(description="The concise summary")
    word_count: int = Field(description="Number of words in summary")

class Summarizer(Module):
    """Summarize text concisely."""

    initial_input = Input
    final_output = Output
    model = "anthropic/claude-sonnet-4-5-20250514"

# Use it
summarizer = Summarizer()
result = summarizer(
    text="Long article text here...",
    max_words=50
)

print(result.summary)
print(f"Words: {result.word_count}")

Multi-Turn Agentic Loop

from pydantic import BaseModel, Field
from acorn import Module, tool

class Input(BaseModel):
    topic: str = Field(description="Research topic")
    depth: str = Field(default="shallow", description="Research depth")

class Output(BaseModel):
    findings: str = Field(description="Summary of findings")
    sources: list[str] = Field(description="Sources consulted")

class ResearchAgent(Module):
    """Research assistant with tools."""

    initial_input = Input
    max_steps = 5  # Enable agentic loop
    final_output = Output
    model = "anthropic/claude-sonnet-4-5-20250514"

    @tool
    def search(self, query: str) -> list:
        """Search for information."""
        # Your search implementation
        return ["result1", "result2"]

    @tool
    def analyze(self, data: str) -> str:
        """Analyze collected data."""
        # Your analysis implementation
        return f"Analysis: {data}"

    def on_step(self, step):
        """Called after each step."""
        print(f"Step {step.counter}")

        # Early termination if done
        if len(step.tool_results) >= 3:
            step.finish(
                findings="Sufficient data collected",
                sources=["source1", "source2"]
            )

        return step

# Use it
agent = ResearchAgent()
result = agent(topic="Large Language Models", depth="shallow")
print(result.findings)

📖 Documentation

askmanu.github.io/acorn


📚 Core Concepts

Module

Base class for LLM agents. Configure with:

  • model - LLM to use (required - no default)
  • temperature - Sampling temperature
  • max_tokens - Maximum tokens to generate
  • max_steps - Max agentic loop iterations (None = single-turn)
  • initial_input - Pydantic model for input schema
  • final_output - Pydantic model for output schema
  • tools - List of available tools
  • cache - Enable provider-level prompt caching
  • model_fallbacks - List of fallback models for automatic failover
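Model fallbacks boil down to trying each configured model in order until one call succeeds. A rough stdlib-only sketch of the failover pattern (`call_with_fallbacks` and `fake_call` are hypothetical stand-ins for illustration, not acorn's API):

```python
def call_with_fallbacks(prompt, models, call_model):
    """Try the primary model first, then each fallback in order.

    `call_model(model, prompt)` stands in for the provider call;
    any exception it raises triggers failover to the next model.
    """
    errors = []
    for model in models:
        try:
            return model, call_model(model, prompt)
        except Exception as exc:
            errors.append((model, exc))
    raise RuntimeError(f"all models failed: {errors}")

# Demo with a fake provider where the primary model is unavailable
def fake_call(model, prompt):
    if model == "anthropic/claude-sonnet-4-5-20250514":
        raise ConnectionError("provider unavailable")
    return f"[{model}] answer to: {prompt}"

model, answer = call_with_fallbacks(
    "What is 2 + 2?",
    ["anthropic/claude-sonnet-4-5-20250514", "openai/gpt-4o"],
    fake_call,
)
print(model)  # → openai/gpt-4o
```

With `model_fallbacks` set on a Module, this failover is handled for you via LiteLLM.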

Tools

Functions the LLM can call:

@tool
def search(query: str, limit: int = 10) -> list:
    """Search for information.

    Args:
        query: The search query
        limit: Maximum results to return
    """
    return search_api(query, limit)

Schema is automatically generated from type hints and docstring.
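That generation step can be approximated with the standard library alone. A simplified sketch of what a decorator like @tool might derive from a function's signature (`make_schema` is a hypothetical illustration; acorn's actual schema format may differ):

```python
import inspect
from typing import get_type_hints

def make_schema(fn):
    """Build a JSON-schema-like dict from a function's type hints and docstring."""
    type_map = {str: "string", int: "integer", float: "number",
                bool: "boolean", list: "array"}
    hints = get_type_hints(fn)
    sig = inspect.signature(fn)
    properties, required = {}, []
    for name, param in sig.parameters.items():
        properties[name] = {"type": type_map.get(hints.get(name), "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # parameters without defaults are required
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn).splitlines()[0],
        "parameters": {"type": "object", "properties": properties,
                       "required": required},
    }

def search(query: str, limit: int = 10) -> list:
    """Search for information."""
    ...

schema = make_schema(search)
print(schema["parameters"]["required"])  # → ['query']
```

Because `limit` has a default, only `query` ends up in the required list, mirroring how the LLM sees optional versus required tool arguments.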

Step Callback

Control agentic loop execution:

def on_step(self, step):
    # Access step info
    print(f"Step {step.counter}")
    print(f"Tools called: {[tc.name for tc in step.tool_calls]}")

    # Dynamic tool management
    step.add_tool(new_tool)
    step.remove_tool("old_tool")

    # Early termination
    if condition:
        step.finish(result="Early exit")

    return step

🎯 Examples

Try them live on the Gradio app or browse the source in examples/:

Example | Category | Description
Simple Q&A | Basic | Single-turn question answering with structured output
HN Production Readiness | Agentic | Checks if a trending HN project is production-ready
Documentation Coverage | Agentic | Scores documentation coverage of a GitHub repo (0–100)
Bus Factor Calculator | Branching | Calculates the bus factor of a GitHub repository
License Compatibility | Agentic | Checks dependency license compatibility for conflicts
Dependency Bloat Scanner | Branching | Finds redundant and overlapping libraries in your deps

🧪 Testing

# Run all tests
pytest

# With coverage
pytest --cov=acorn

# Specific test file
pytest tests/test_agentic_loop.py -v

Current status: 201 tests passing, 85% coverage


🛣️ Roadmap

✅ Completed

  • Single-turn execution
  • Multi-turn agentic loops
  • Tool calling with auto-schema generation
  • Parse error recovery
  • Dynamic tool management
  • Step callbacks
  • Streaming responses with partial structured output
  • Forced termination strategies
  • Provider caching
  • Model fallbacks
  • Branching workflows

📋 Planned

  • Async support
  • More docs
  • Integration examples with different providers (vector DBs, observability tools, etc.)

🤝 Contributing

Contributions welcome! Please:

  1. Check open issues for areas to help
  2. Write tests for new features (maintain >80% coverage)
  3. Update documentation
  4. Add examples for new features

🙏 Acknowledgments

Thanks to @rosenbrockc for donating the acorn pip package name.


📄 License

MIT License

Project details


Download files

Download the file for your platform.

Source Distribution

acorn-0.7.2.tar.gz (38.2 kB)

Uploaded Source

Built Distribution


acorn-0.7.2-py3-none-any.whl (32.7 kB)

Uploaded Python 3

File details

Details for the file acorn-0.7.2.tar.gz.

File metadata

  • Download URL: acorn-0.7.2.tar.gz
  • Upload date:
  • Size: 38.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for acorn-0.7.2.tar.gz
Algorithm Hash digest
SHA256 d4bc4c3b7808ecfe2cf02add09300824eebeeb42bf930286c998b8e5923e8664
MD5 b2b5848124eee6d77f8119ba43c86987
BLAKE2b-256 998f98d73e9e88523b0a5e862d77897f33bfe33fe7c9cada4b4194cb44c362e1


Provenance

The following attestation bundles were made for acorn-0.7.2.tar.gz:

Publisher: publish.yml on askmanu/acorn

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file acorn-0.7.2-py3-none-any.whl.

File metadata

  • Download URL: acorn-0.7.2-py3-none-any.whl
  • Upload date:
  • Size: 32.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for acorn-0.7.2-py3-none-any.whl
Algorithm Hash digest
SHA256 12512629883980fab6ea3711ee3de6f1b80447e588a8a200761bda9788e6849a
MD5 ee76284f57a9e2e49e4a1716c20044ef
BLAKE2b-256 3c275d7caf08acd3ca61e841cb0a0fc8efaa790d818965d466028ec3c5da9d61


Provenance

The following attestation bundles were made for acorn-0.7.2-py3-none-any.whl:

Publisher: publish.yml on askmanu/acorn

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
