Python SDK for the Opper Task API

Opper Python SDK

Python client for the Opper API.

Install

pip install opperai

Quick Start

from opperai import Opper

opper = Opper()  # uses OPPER_API_KEY env var

result = opper.call("summarize", input={"text": "Long article..."})
print(result.data)

# Stream a function
for chunk in opper.stream("summarize", input={"text": "Long article..."}):
    if chunk.type == "content":
        print(chunk.delta, end="")
    if chunk.type == "complete":
        print(chunk.data)

Schema Support

Pass Pydantic models, dataclasses, TypedDicts, or raw JSON Schema dicts for input_schema and output_schema — the SDK resolves them to JSON Schema automatically.

from pydantic import BaseModel

class Summary(BaseModel):
    summary: str
    entities: list[str]

result = opper.call(
    "extract",
    input={"text": "Marie Curie was a physicist in Paris."},
    output_schema=Summary,
)
result.data.summary   # str — typed!
result.data.entities  # list[str]

Dataclasses, TypedDicts, and plain dicts also work. See 01a_using_schemas.py and 01b_using_other_schemas.py.
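For illustration, the dataclass, TypedDict, and raw JSON Schema forms of the Summary model above might look like this (a sketch; the names SummaryDC, SummaryTD, and summary_schema are ours, not the SDK's):

```python
from dataclasses import dataclass
from typing import TypedDict

# Dataclass form of the Summary model above.
@dataclass
class SummaryDC:
    summary: str
    entities: list[str]

# TypedDict form.
class SummaryTD(TypedDict):
    summary: str
    entities: list[str]

# Equivalent raw JSON Schema dict.
summary_schema = {
    "type": "object",
    "properties": {
        "summary": {"type": "string"},
        "entities": {"type": "array", "items": {"type": "string"}},
    },
    "required": ["summary", "entities"],
}
```

Per the docs, any of these could stand in for output_schema=Summary in the call above.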

Observability

Use trace() as a decorator or context manager to group calls under a single trace span. Nesting works naturally.

@opper.trace("my-pipeline")
def run():
    a = opper.call("step-1", input="hello")
    b = opper.call("step-2", input=a.data)

# or as a context manager
with opper.trace("my-pipeline") as span:
    opper.call("step-1", input="hello")

Agent SDK

Build AI agents with tool use, streaming, multi-agent composition, and MCP integration.

from opperai import Agent, tool

@tool
def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"Sunny, 22°C in {city}"

agent = Agent(
    name="weather-assistant",
    instructions="You are a helpful weather assistant.",
    tools=[get_weather],
)

# Run — get the final result
result = await agent.run("What's the weather in Paris?")
print(result.output)
print(result.meta.usage)  # token usage across all iterations

# Stream — observe events as the agent works
stream = agent.stream("What's the weather in Paris?")
async for event in stream:
    if event.type == "text_delta":
        print(event.text, end="", flush=True)
    if event.type == "tool_start":
        print(f"\nCalling {event.name}...")
result = await stream.result()
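The await lines above assume you are already inside a coroutine; at a script's top level you would wrap them in asyncio.run. A minimal sketch, with a stub body standing in for the agent calls:

```python
import asyncio

async def main():
    # Replace this stub with the agent calls from the snippets above, e.g.
    #   result = await agent.run("What's the weather in Paris?")
    return "done"

print(asyncio.run(main()))
```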

Structured Output

from pydantic import BaseModel

class Sentiment(BaseModel):
    label: str
    score: float

agent = Agent(
    name="analyzer",
    instructions="Analyze the sentiment of the input.",
    output_schema=Sentiment,
)

result = await agent.run("I love this product!")
result.output.label  # str — typed via Pydantic
result.output.score  # float

Multi-Agent Composition

researcher = Agent(name="researcher", instructions="...", tools=[web_search])
writer = Agent(
    name="writer",
    instructions="Write clear reports using research.",
    tools=[researcher.as_tool(name="research", description="Research a topic")],
)

result = await writer.run("Write a report on AI agents")

MCP Integration

from opperai.agent.mcp import mcp, MCPStdioConfig

agent = Agent(
    name="file-assistant",
    instructions="Help users manage files.",
    tools=[mcp(MCPStdioConfig(name="fs", command="uvx", args=["mcp-server-filesystem", "/tmp"]))],
)

Conversation (Multi-Turn)

conversation = agent.conversation()
r1 = await conversation.send("My name is Alice")
r2 = await conversation.send("What is my name?")
# r2.output → "Your name is Alice"
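A sketch of driving several turns in a loop; here send is a stand-in for conversation.send, and the fake below just echoes so the snippet runs without the SDK:

```python
import asyncio

async def run_turns(send, messages):
    # Send each message in order and collect the replies;
    # pass conversation.send as `send` in real code.
    replies = []
    for msg in messages:
        replies.append(await send(msg))
    return replies

# Demo with a fake `send` that echoes (no API involved).
async def fake_send(msg):
    return f"echo: {msg}"

replies = asyncio.run(run_turns(fake_send, ["My name is Alice", "What is my name?"]))
```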

Examples

#    Example            What it shows
00   First call         Simplest possible call
01a  Pydantic schemas   Type-safe output with Pydantic
01b  Other schemas      Dataclass, TypedDict, raw dict
02   Streaming          Stream deltas + complete event
03a  Tools (call)       Tool definitions with call()
03b  Tools (stream)     Tool call chunks in streaming
04a  Generate image     Image generation
04b  Describe image     Vision / image description
04c  Edit image         Image editing
05   Audio              Text-to-speech + speech-to-text
06   Video              Video generation
07   Embeddings         Vector embeddings + similarity
08   Function mgmt      List, get, revisions, delete
09   Observability      Tracing with decorator + context manager
09b  Manual tracing     Manual span creation
09c  Traces             List, get, and inspect traces
10   Models             List available models
12   Knowledge base     Semantic search with knowledge bases
13   Web tools          Web search and URL fetch (beta)

Run a single example:

export OPPER_API_KEY="your-key"
uv run python examples/getting-started/00_your_first_call.py

Run all examples:

uv run python examples/run_all.py

Configuration

Parameter   Default                 Env Var
api_key     (none)                  OPPER_API_KEY
base_url    https://api.opper.ai    OPPER_BASE_URL
headers     {}                      (none)
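For example, the environment variables from the table can be set before running a script (overriding OPPER_BASE_URL is only needed when targeting a non-default endpoint):

```shell
export OPPER_API_KEY="your-key"
# Optional: override the API endpoint (defaults to https://api.opper.ai)
export OPPER_BASE_URL="https://api.opper.ai"
```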

Error Handling

from opperai import ApiError

try:
    opper.call("my-fn", input="hello")
except ApiError as e:
    print(e.status, e.body)
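A common pattern on top of this is retrying transient failures. The helper below is a generic sketch (not part of the SDK) that you could wrap around opper.call; in real code you would likely catch ApiError rather than bare Exception:

```python
import time

def with_retries(fn, attempts=3, delay=0.0):
    # Call fn(); on exception, retry up to `attempts` times total,
    # sleeping `delay` seconds between tries. Re-raises the last error.
    last_err = None
    for _ in range(attempts):
        try:
            return fn()
        except Exception as err:  # narrow to ApiError in real code
            last_err = err
            time.sleep(delay)
    raise last_err

# Usage sketch: with_retries(lambda: opper.call("my-fn", input="hello"))
```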

Async Support

All methods have _async variants:

result = await opper.call_async("summarize", input={"text": "..."})

async for chunk in opper.stream_async("summarize", input={"text": "..."}):
    print(chunk.delta, end="")
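Because the _async variants are ordinary coroutines, they compose with asyncio.gather for concurrent calls. In this sketch fetch_summary is a stand-in for opper.call_async so the snippet runs without the API:

```python
import asyncio

async def fetch_summary(text):
    # Stand-in for: await opper.call_async("summarize", input={"text": text})
    await asyncio.sleep(0)
    return f"summary of {text}"

async def main():
    # Run two calls concurrently; gather preserves argument order.
    return await asyncio.gather(fetch_summary("a"), fetch_summary("b"))

results = asyncio.run(main())
```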

Requirements

  • Python 3.10+
  • Optional: pip install opperai[pydantic] for Pydantic schema support

License

MIT
