Opper Agent SDK
A Python SDK for building AI agents with the Opper Task Completion API. Create intelligent agents that run tool-based reasoning loops with dynamic tool selection, event tracking, and MCP integration.
Table of Contents
- Features
- Getting Started
- Installation
- Quick Start
- Agent as a Tool
- MCP (Model Context Protocol) Integration
- Hooks
- Visualizing Agent Flow
- Monitoring and Tracing
- License
- Support
1. Features
- Reasoning with customizable model: Think → Act reasoning loop with dynamic tool selection
- Extendable tool support: Support for MCP or custom tools with output schemas and examples
- Parallel tool execution: Run independent tool calls concurrently for lower latency
- Event Hooks: Flexible hook system for accessing any internal Agent event
- Composable interface: Agent supports structured input and output schema for ease of integration
- Multi-agent support: Agents can be used as tools for other agents with usage propagation
- Usage & Cost tracking: run() returns the result plus a detailed token usage and cost breakdown
- Type-safe internals: Pydantic model validation throughout execution
- Error Handling: Robust error handling with retry mechanisms
- Tracing & Monitoring: Full observability with Opper's tracing system
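The Think → Act loop with parallel tool execution can be illustrated with a small stdlib-only sketch. This is a conceptual illustration, not the SDK's actual internals; the tool registry and the batched `plan` here are invented for the example:

```python
import asyncio

# Hypothetical tool registry; in the SDK, tools come from @tool or MCP.
TOOLS = {
    "add": lambda a, b: a + b,
    "mul": lambda a, b: a * b,
}

async def run_tool(name, args):
    """Execute one tool call (sync tools run in the default executor)."""
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, lambda: TOOLS[name](**args))

async def reasoning_loop(plan):
    """Think -> Act: each 'thought' yields a batch of independent tool
    calls, which run concurrently via asyncio.gather for lower latency."""
    results = []
    for batch in plan:  # one batch per think step
        batch_results = await asyncio.gather(
            *(run_tool(name, args) for name, args in batch)
        )
        results.extend(batch_results)
    return results

plan = [
    [("add", {"a": 2, "b": 3}), ("mul", {"a": 4, "b": 5})],  # independent calls
    [("add", {"a": 1, "b": 1})],
]
print(asyncio.run(reasoning_loop(plan)))  # [5, 20, 2]
```

Calls within a batch run concurrently, while batches stay sequential, which preserves any data dependency between think steps.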
2. Getting Started
Building an agent takes three steps:
from opper_agents import Agent, tool

# 1. Define your tools
@tool
def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"The weather in {city} is sunny"

# 2. Create the agent
agent = Agent(
    name="WeatherBot",
    description="Helps with weather queries",
    tools=[get_weather],
)

# 3. Run it (inside an async function)
run_result = await agent.run("What's the weather in Paris?")
print(run_result.result)  # The answer
print(run_result.usage)   # Token usage and cost
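Under the hood, a decorator like @tool can derive a tool's name, description, and parameter schema from the function signature and docstring. A minimal stdlib sketch of the idea; the SDK's real schema format is likely richer and may differ:

```python
import inspect

def tool(fn):
    """Attach a simple schema derived from the signature and docstring."""
    sig = inspect.signature(fn)
    fn.tool_schema = {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "parameters": {
            name: getattr(p.annotation, "__name__", str(p.annotation))
            for name, p in sig.parameters.items()
        },
    }
    return fn

@tool
def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"The weather in {city} is sunny"

print(get_weather.tool_schema["parameters"])  # {'city': 'str'}
```

This is why type annotations and docstrings matter: they become the description the model sees when deciding which tool to call.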
3. Installation
Prerequisites
- Python >= 3.11
Install from PyPI
pip install opper-agents
Or using UV:
uv pip install opper-agents
Install from Source (For Development)
If you want to contribute or modify the SDK:
- Clone the repository:
git clone https://github.com/opper-ai/opperai-agent-sdk.git
cd opperai-agent-sdk
- Install in editable mode:
# Using pip
pip install -e .
# Or using UV (recommended)
uv pip install -e .
4. Quick Start
4.1. Set up your environment
export OPPER_API_KEY="your-opper-api-key"
Get your API key at platform.opper.ai.
4.2. Explore the Examples
Check out the examples/ directory for working examples:
- Getting Started (examples/01_getting_started/): basic agent usage, memory, hooks, parallel execution, tool schemas
- MCP Integration (examples/02_mcp_examples/): connect to MCP servers
- Applied Agents (examples/applied_agents/): real-world examples like multi-agent systems
- Custom Agents (examples/custom_agents/): build specialized agent types
Run any example:
python examples/01_getting_started/01_first_agent.py
4.3. Model Selection
You can specify AI models at the agent level to control which model is used for reasoning. The SDK supports all models available through the Opper API:
# Create an agent with a specific model
agent = Agent(
    name="ClaudeAgent",
    description="An agent that uses Claude for reasoning",
    tools=[my_tools],
    model="anthropic/claude-4-sonnet",  # Model for reasoning and tool selection
)
If no model is specified, Opper uses a default model optimized for agent reasoning.
5. Agent as a Tool
Agents can be used as tools by other agents for delegation and specialization:
# Create specialized agents
math_agent = Agent(name="MathAgent", description="Performs calculations")
research_agent = Agent(name="ResearchAgent", description="Explains concepts")

# Use them as tools in a coordinator agent
coordinator = Agent(
    name="Coordinator",
    tools=[math_agent.as_tool(), research_agent.as_tool()],
)
See examples/01_getting_started/02_agent_as_tool.py for a complete example.
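Conceptually, as_tool() wraps an agent's run() as an ordinary async tool that carries the agent's name and description as metadata. A hedged stdlib sketch of the pattern; DummyAgent is invented here, and the SDK's wrapper additionally propagates usage data:

```python
import asyncio

class DummyAgent:
    """Stand-in for an Agent; real agents call the Opper API."""
    def __init__(self, name, description):
        self.name = name
        self.description = description

    async def run(self, goal: str) -> str:
        return f"{self.name} handled: {goal}"

    def as_tool(self):
        """Expose the agent as a plain async callable with tool metadata."""
        async def agent_tool(goal: str) -> str:
            return await self.run(goal)
        agent_tool.__name__ = self.name
        agent_tool.__doc__ = self.description
        return agent_tool

math_agent = DummyAgent("MathAgent", "Performs calculations")
tool_fn = math_agent.as_tool()
print(asyncio.run(tool_fn("2 + 2")))  # MathAgent handled: 2 + 2
```

From the coordinator's point of view, a sub-agent is just another tool; delegation is a tool call whose "result" is the sub-agent's final answer.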
6. MCP (Model Context Protocol) Integration
The SDK supports MCP servers as tool providers, allowing agents to connect to external services. Both stdio and HTTP-SSE transports are supported.
from opper_agents import Agent, mcp, MCPServerConfig

# Configure an MCP server
filesystem_server = MCPServerConfig(
    name="filesystem",
    transport="stdio",
    command="docker",
    args=["run", "-i", "--rm", "..."],
)

# Use MCP tools in your agent
agent = Agent(
    name="FileAgent",
    tools=[mcp(filesystem_server)],
)
See examples/02_mcp_examples/ for working examples with filesystem, SQLite, and Composio servers.
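Over the stdio transport, MCP messages are JSON-RPC 2.0 objects written line-by-line to the server process's stdin/stdout. A minimal sketch of building a tools/call request, purely to show the wire shape; the SDK handles all framing for you, and the read_file tool name here is an assumption:

```python
import json

def mcp_request(method: str, params: dict, req_id: int) -> str:
    """Serialize a JSON-RPC 2.0 request as a single line, as used by
    MCP's stdio transport (one JSON object per line)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params,
    })

line = mcp_request(
    "tools/call",
    {"name": "read_file", "arguments": {"path": "/tmp/example.txt"}},
    1,
)
print(json.loads(line)["method"])  # tools/call
```

Because the protocol is transport-agnostic JSON-RPC, the same agent code works whether the server runs as a local subprocess (stdio) or remotely over HTTP-SSE.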
7. Hooks
Hooks let you run code at specific points in the agent's lifecycle for logging, monitoring, or custom behavior:
Available hooks: agent_start, agent_end, agent_error, loop_start, loop_end, llm_call, llm_response, think_end, tool_call, tool_result, memory_read, memory_write, memory_error
from opper_agents import Agent, hook
from opper_agents.base.context import AgentContext
from opper_agents.base.agent import BaseAgent

@hook("agent_start")
async def log_start(context: AgentContext, agent: BaseAgent, **kwargs):
    print(f"Agent {agent.name} starting with goal: {context.goal}")

agent = Agent(
    name="MyAgent",
    hooks=[log_start],
    tools=[...],
)

# Or use the EventEmitter-style API
agent.on("agent_end", my_handler)
agent.once("agent_error", my_error_handler)
See examples/01_getting_started/05_hooks.py for all available hooks with detailed examples.
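The on/once API follows the familiar EventEmitter pattern: on registers a persistent handler, once registers one that fires a single time. A stdlib sketch of how such a dispatcher can work, not the SDK's actual implementation:

```python
import asyncio

class EventEmitter:
    def __init__(self):
        self._handlers = {}  # event name -> list of (handler, once_flag)

    def on(self, event, handler):
        self._handlers.setdefault(event, []).append((handler, False))

    def once(self, event, handler):
        self._handlers.setdefault(event, []).append((handler, True))

    async def emit(self, event, **kwargs):
        remaining = []
        for handler, once in self._handlers.get(event, []):
            result = handler(**kwargs)
            if asyncio.iscoroutine(result):
                await result  # support async handlers like the SDK's hooks
            if not once:
                remaining.append((handler, once))
        self._handlers[event] = remaining

calls = []
emitter = EventEmitter()
emitter.on("agent_end", lambda **kw: calls.append(("end", kw["goal"])))
emitter.once("agent_error", lambda **kw: calls.append(("error", kw["goal"])))

asyncio.run(emitter.emit("agent_error", goal="demo"))
asyncio.run(emitter.emit("agent_error", goal="demo"))  # once-handler already removed
print(calls)  # [('error', 'demo')]
```

Awaiting coroutine results lets the same dispatcher serve both sync and async handlers, which matches how the SDK's hooks are declared as async functions.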
8. Visualizing Agent Flow
Generate Mermaid diagrams of your agent's structure showing tools, sub-agents, schemas, and hooks. Perfect for documentation and understanding complex multi-agent systems.
agent.visualize_flow(output_path="agent_flow.md")
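As an illustration of what such a diagram contains, here is a tiny sketch that emits a Mermaid flowchart from an agent name and its tool names. The SDK's actual output is richer (sub-agents, schemas, hooks); this only shows the agent-to-tool edges:

```python
def mermaid_flow(agent_name: str, tool_names: list[str]) -> str:
    """Render a minimal Mermaid flowchart: agent node -> one node per tool."""
    lines = ["flowchart TD", f"    A[{agent_name}]"]
    for i, name in enumerate(tool_names):
        lines.append(f"    A --> T{i}[{name}]")
    return "\n".join(lines)

print(mermaid_flow("WeatherBot", ["get_weather"]))
# flowchart TD
#     A[WeatherBot]
#     A --> T0[get_weather]
```

Writing the returned string into a markdown file inside a mermaid fence is enough for GitHub and most documentation tools to render it.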
9. Monitoring and Tracing
The agent provides comprehensive observability for production deployments:
Agent Tracing
- Agent-level spans: Track entire reasoning sessions
- Thought cycles: Monitor think-act iterations
- Tool execution: Performance metrics for each tool call
- Model interactions: AI reasoning and decision making
View your traces in the Opper Dashboard
10. License
This project is licensed under the MIT License - see the LICENSE file for details.
11. Support
- Documentation: Opper Documentation
- Issues: GitHub Issues
- Community: Opper Discord
Built with Opper