Opper Python SDK

Python client for the Opper Task API.

Install

pip install opperai

Quick Start

from opperai import Opper

opper = Opper()  # uses OPPER_API_KEY env var

result = opper.call("summarize", input={"text": "Long article..."})
print(result.data)

# Stream a function
for chunk in opper.stream("summarize", input={"text": "Long article..."}):
    if chunk.type == "content":
        print(chunk.delta, end="")
    elif chunk.type == "complete":
        print(chunk.data)
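The deltas can be accumulated into the full text as they arrive. A minimal sketch using a stand-in generator that mimics the chunk shape above (`Chunk` and `fake_stream` are illustrative, not part of the SDK; substitute `opper.stream(...)`):

```python
from dataclasses import dataclass
from typing import Iterator

@dataclass
class Chunk:
    type: str
    delta: str = ""
    data: str = ""

def fake_stream() -> Iterator[Chunk]:
    # Stand-in for opper.stream(...), emitting the same chunk types.
    for piece in ["A short ", "summary."]:
        yield Chunk(type="content", delta=piece)
    yield Chunk(type="complete", data="A short summary.")

parts = []
final = None
for chunk in fake_stream():
    if chunk.type == "content":
        parts.append(chunk.delta)  # partial text as it arrives
    elif chunk.type == "complete":
        final = chunk.data         # the full result

print("".join(parts))  # A short summary.
```

The joined deltas and the `complete` payload carry the same text, so you can render incrementally and still keep the final result.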

Schema Support

Pass Pydantic models, dataclasses, TypedDicts, or raw JSON Schema dicts for input_schema and output_schema — the SDK resolves them to JSON Schema automatically.

from pydantic import BaseModel

class Summary(BaseModel):
    summary: str
    entities: list[str]

result = opper.call(
    "extract",
    input={"text": "Marie Curie was a physicist in Paris."},
    output_schema=Summary,
)
result.data.summary   # str — typed!
result.data.entities  # list[str]

Dataclasses, TypedDicts, and plain dicts also work. See 01a_using_schemas.py and 01b_using_other_schemas.py.
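The resolved JSON Schema can be inspected with Pydantic itself: `model_json_schema()` on a model, and `TypeAdapter(...).json_schema()` for dataclasses and TypedDicts. A sketch reusing the `Summary` model from above (this exercises Pydantic directly, not an SDK API):

```python
from dataclasses import dataclass
from pydantic import BaseModel, TypeAdapter

class Summary(BaseModel):
    summary: str
    entities: list[str]

@dataclass
class Person:
    name: str
    age: int

# A BaseModel produces its schema directly.
print(list(Summary.model_json_schema()["properties"]))  # ['summary', 'entities']

# TypeAdapter handles dataclasses (and TypedDicts) the same way.
print(TypeAdapter(Person).json_schema()["required"])  # ['name', 'age']
```

This is a handy way to check what the SDK will send as `output_schema` before making a call.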

Observability

Use trace() as a decorator or context manager to group calls under a single trace span. Nesting works naturally.

@opper.trace("my-pipeline")
def run():
    a = opper.call("step-1", input="hello")
    b = opper.call("step-2", input=a.data)

# or as a context manager
with opper.trace("my-pipeline") as span:
    opper.call("step-1", input="hello")

Agent SDK

Build AI agents with tool use, streaming, multi-agent composition, and MCP integration.

from opperai import Agent, tool

@tool
def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"Sunny, 22°C in {city}"

agent = Agent(
    name="weather-assistant",
    instructions="You are a helpful weather assistant.",
    tools=[get_weather],
)

# Run — get the final result
result = await agent.run("What's the weather in Paris?")
print(result.output)
print(result.meta.usage)  # token usage across all iterations

# Stream — observe events as the agent works
stream = agent.stream("What's the weather in Paris?")
async for event in stream:
    if event.type == "text_delta":
        print(event.text, end="", flush=True)
    elif event.type == "tool_start":
        print(f"\nCalling {event.name}...")
result = await stream.result()
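A tool definition is built from the function's name, docstring, and typed parameters. How `@tool` extracts these is internal to the SDK, but the same metadata can be gathered with the standard library; a sketch of the idea (the `describe` helper is illustrative, not an SDK API):

```python
import inspect
from typing import get_type_hints

def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"Sunny, 22°C in {city}"

def describe(fn):
    # Collect the metadata a tool schema would be built from.
    hints = get_type_hints(fn)
    hints.pop("return", None)
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn),
        "parameters": {name: t.__name__ for name, t in hints.items()},
    }

print(describe(get_weather))
# {'name': 'get_weather', 'description': 'Get the current weather for a city.',
#  'parameters': {'city': 'str'}}
```

This is why the docstring and type hints on a tool function matter: they are the model's only description of when and how to call it.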

Structured Output

from pydantic import BaseModel

class Sentiment(BaseModel):
    label: str
    score: float

agent = Agent(
    name="analyzer",
    instructions="Analyze the sentiment of the input.",
    output_schema=Sentiment,
)

result = await agent.run("I love this product!")
result.output.label  # str — typed via Pydantic
result.output.score  # float

Multi-Agent Composition

researcher = Agent(name="researcher", instructions="...", tools=[web_search])
writer = Agent(
    name="writer",
    instructions="Write clear reports using research.",
    tools=[researcher.as_tool(name="research", description="Research a topic")],
)

result = await writer.run("Write a report on AI agents")

MCP Integration

from opperai.agent.mcp import mcp, MCPStdioConfig

agent = Agent(
    name="file-assistant",
    instructions="Help users manage files.",
    tools=[mcp(MCPStdioConfig(name="fs", command="uvx", args=["mcp-server-filesystem", "/tmp"]))],
)

Conversation (Multi-Turn)

conversation = agent.conversation()
r1 = await conversation.send("My name is Alice")
r2 = await conversation.send("What is my name?")
# r2.output → "Your name is Alice"

Examples

#    Example            What it shows
00   First call         Simplest possible call
01a  Pydantic schemas   Type-safe output with Pydantic
01b  Other schemas      Dataclass, TypedDict, raw dict
02   Streaming          Stream deltas + complete event
03a  Tools (call)       Tool definitions with call()
03b  Tools (stream)     Tool call chunks in streaming
04a  Generate image     Image generation
04b  Describe image     Vision / image description
04c  Edit image         Image editing
05   Audio              Text-to-speech + speech-to-text
06   Video              Video generation
07   Embeddings         Vector embeddings + similarity
08   Function mgmt      List, get, revisions, delete
09   Observability      Tracing with decorator + context manager
09b  Manual tracing     Manual span creation
09c  Traces             List, get, and inspect traces
10   Models             List available models
12   Knowledge base     Semantic search with knowledge bases
13   Web tools          Web search and URL fetch (beta)

Run a single example:

export OPPER_API_KEY="your-key"
uv run python examples/getting-started/00_your_first_call.py

Run all examples:

uv run python examples/run_all.py

Configuration

Parameter  Default                Env Var
api_key    (required)             OPPER_API_KEY
base_url   https://api.opper.ai   OPPER_BASE_URL
headers    {}                     (none)

Error Handling

from opperai import ApiError

try:
    opper.call("my-fn", input="hello")
except ApiError as e:
    print(e.status, e.body)
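Transient failures (429, 5xx) are often worth retrying with exponential backoff. A minimal sketch of that pattern; `FakeApiError` and `flaky` are stand-ins so the example runs standalone (substitute `ApiError` and a real `opper.call`):

```python
import time

class FakeApiError(Exception):
    # Stand-in for opperai.ApiError, which carries a .status.
    def __init__(self, status):
        self.status = status

def call_with_retry(fn, retries=3, base_delay=0.01):
    for attempt in range(retries):
        try:
            return fn()
        except FakeApiError as e:
            retryable = e.status == 429 or e.status >= 500
            if not retryable or attempt == retries - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)  # exponential backoff

attempts = []
def flaky():
    # Fails twice with a 503, then succeeds.
    attempts.append(1)
    if len(attempts) < 3:
        raise FakeApiError(503)
    return "ok"

print(call_with_retry(flaky))  # ok
```

Client errors such as 400 or 401 are re-raised immediately, since retrying them cannot succeed.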

Async Support

All methods have _async variants:

result = await opper.call_async("summarize", input={"text": "..."})

async for chunk in opper.stream_async("summarize", input={"text": "..."}):
    print(chunk.delta, end="")
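The `_async` variants compose with `asyncio` for concurrent fan-out, e.g. summarizing several texts at once with `asyncio.gather`. A sketch with a stand-in coroutine (`fake_call_async` is illustrative; substitute `opper.call_async`):

```python
import asyncio

async def fake_call_async(name, input):
    # Stand-in for opper.call_async; pretend each call takes 10 ms.
    await asyncio.sleep(0.01)
    return f"{name}:{input}"

async def main():
    # The three calls run concurrently, not one after another.
    return await asyncio.gather(
        fake_call_async("summarize", "a"),
        fake_call_async("summarize", "b"),
        fake_call_async("summarize", "c"),
    )

print(asyncio.run(main()))  # ['summarize:a', 'summarize:b', 'summarize:c']
```

Total wall time is roughly one call's latency rather than the sum of all three.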

Requirements

  • Python 3.10+
  • Optional: pip install opperai[pydantic] for Pydantic schema support

License

MIT
