Opper Python SDK

Python client for the Opper API.

Install

pip install opperai

Quick Start

from opperai import Opper

opper = Opper()  # uses OPPER_API_KEY env var

result = opper.call("summarize", input={"text": "Long article..."})
print(result.data)

# Stream a function
for chunk in opper.stream("summarize", input={"text": "Long article..."}):
    if chunk.type == "content":
        print(chunk.delta, end="")
    if chunk.type == "complete":
        print(chunk.data)

Schema Support

Pass Pydantic models, dataclasses, TypedDicts, or raw JSON Schema dicts for input_schema and output_schema — the SDK resolves them to JSON Schema automatically.

from pydantic import BaseModel

class Summary(BaseModel):
    summary: str
    entities: list[str]

result = opper.call(
    "extract",
    input={"text": "Marie Curie was a physicist in Paris."},
    output_schema=Summary,
)
result.data.summary   # str — typed!
result.data.entities  # list[str]

Dataclasses, TypedDicts, and plain dicts also work. See 01a_using_schemas.py and 01b_using_other_schemas.py.
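For comparison, a minimal sketch of the non-Pydantic forms — a dataclass, a TypedDict, and a raw JSON Schema dict describing the same shape. The dict below is our hand-written equivalent, not necessarily the exact schema the SDK emits:

```python
from dataclasses import dataclass
from typing import TypedDict

@dataclass
class Summary:
    summary: str
    entities: list[str]

class SummaryDict(TypedDict):
    summary: str
    entities: list[str]

# Hand-written JSON Schema for the same shape; raw dicts like this
# can be passed directly as output_schema.
summary_schema = {
    "type": "object",
    "properties": {
        "summary": {"type": "string"},
        "entities": {"type": "array", "items": {"type": "string"}},
    },
    "required": ["summary", "entities"],
}

# Any of the three works:
# opper.call(..., output_schema=Summary)        # dataclass
# opper.call(..., output_schema=SummaryDict)    # TypedDict
# opper.call(..., output_schema=summary_schema) # raw dict
```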

Observability

Use trace() as a decorator or context manager to group calls under a single trace span. Nesting works naturally.

@opper.trace("my-pipeline")
def run():
    a = opper.call("step-1", input="hello")
    b = opper.call("step-2", input=a.data)

# or as a context manager — nested trace() calls become child spans
with opper.trace("my-pipeline"):
    opper.call("step-1", input="hello")
    with opper.trace("sub-step"):
        opper.call("step-2", input="world")

Agent SDK

Build AI agents with tool use, streaming, multi-agent composition, and MCP integration.

from opperai import Agent, tool

@tool
def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"Sunny, 22°C in {city}"

agent = Agent(
    name="weather-assistant",
    instructions="You are a helpful weather assistant.",
    tools=[get_weather],
)

# Run — get the final result
result = await agent.run("What's the weather in Paris?")
print(result.output)
print(result.meta.usage)  # token usage across all iterations

# Stream — observe events as the agent works
stream = agent.stream("What's the weather in Paris?")
async for event in stream:
    if event.type == "text_delta":
        print(event.text, end="", flush=True)
    if event.type == "tool_start":
        print(f"\nCalling {event.name}...")
result = await stream.result()

Structured Output

from pydantic import BaseModel

class Sentiment(BaseModel):
    label: str
    score: float

agent = Agent(
    name="analyzer",
    instructions="Analyze the sentiment of the input.",
    output_schema=Sentiment,
)

result = await agent.run("I love this product!")
result.output.label  # str — typed via Pydantic
result.output.score  # float

Multi-Agent Composition

researcher = Agent(name="researcher", instructions="...", tools=[web_search])
writer = Agent(
    name="writer",
    instructions="Write clear reports using research.",
    tools=[researcher.as_tool(name="research", description="Research a topic")],
)

result = await writer.run("Write a report on AI agents")

MCP Integration

from opperai.agent.mcp import mcp, MCPStdioConfig

agent = Agent(
    name="file-assistant",
    instructions="Help users manage files.",
    tools=[mcp(MCPStdioConfig(name="fs", command="uvx", args=["mcp-server-filesystem", "/tmp"]))],
)

Conversation (Multi-Turn)

conversation = agent.conversation()
r1 = await conversation.send("My name is Alice")
r2 = await conversation.send("What is my name?")
# r2.output → "Your name is Alice"

Examples

| #   | Example          | What it shows                            |
|-----|------------------|------------------------------------------|
| 00  | First call       | Simplest possible call                   |
| 01a | Pydantic schemas | Type-safe output with Pydantic           |
| 01b | Other schemas    | Dataclass, TypedDict, raw dict           |
| 02  | Streaming        | Stream deltas + complete event           |
| 03a | Tools (call)     | Tool definitions with call()             |
| 03b | Tools (stream)   | Tool call chunks in streaming            |
| 04a | Generate image   | Image generation                         |
| 04b | Describe image   | Vision / image description               |
| 04c | Edit image       | Image editing                            |
| 05  | Audio            | Text-to-speech + speech-to-text          |
| 06  | Video            | Video generation                         |
| 07  | Embeddings       | Vector embeddings + similarity           |
| 08  | Function mgmt    | List, get, revisions, delete             |
| 09  | Observability    | Tracing with decorator + context manager |
| 09b | Manual tracing   | Manual span creation                     |
| 09c | Traces           | List, get, and inspect traces            |
| 10  | Models           | List available models                    |
| 12  | Knowledge base   | Semantic search with knowledge bases     |
| 13  | Web tools        | Web search and URL fetch (beta)          |

Run a single example:

export OPPER_API_KEY="your-key"
uv run python examples/getting-started/00_your_first_call.py

Run all examples:

uv run python examples/run_all.py

Configuration

| Parameter | Default              | Env Var        |
|-----------|----------------------|----------------|
| api_key   |                      | OPPER_API_KEY  |
| base_url  | https://api.opper.ai | OPPER_BASE_URL |
| headers   | {}                   |                |
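The fallback order the table describes can be sketched as a small helper. The helper itself is illustrative, not part of the SDK; only the parameter names, env vars, and defaults come from the table:

```python
import os

# Illustrative helper mirroring the table: each constructor parameter
# falls back to its environment variable, then to its default.
def opper_config(**overrides):
    config = {
        "api_key": os.environ.get("OPPER_API_KEY"),
        "base_url": os.environ.get("OPPER_BASE_URL", "https://api.opper.ai"),
        "headers": {},
    }
    config.update(overrides)
    return config

# opper = Opper(**opper_config(headers={"x-team": "docs"}))
```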

Error Handling

from opperai import ApiError

try:
    opper.call("my-fn", input="hello")
except ApiError as e:
    print(e.status, e.body)
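Since ApiError carries the HTTP status, transient failures can be retried. A sketch of a generic backoff wrapper — the helper and the retryable status list are our own, not part of the SDK:

```python
import time

# Illustrative retry wrapper: retry_on is the exception type to catch
# (e.g. ApiError) and statuses the HTTP codes worth retrying.
def call_with_retry(fn, retry_on=Exception,
                    statuses=(429, 500, 502, 503),
                    attempts=3, base_delay=0.5):
    for attempt in range(attempts):
        try:
            return fn()
        except retry_on as e:
            if getattr(e, "status", None) not in statuses or attempt == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)  # exponential backoff

# usage:
# call_with_retry(lambda: opper.call("my-fn", input="hello"), retry_on=ApiError)
```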

Async Support

All methods have _async variants:

result = await opper.call_async("summarize", input={"text": "..."})

async for chunk in opper.stream_async("summarize", input={"text": "..."}):
    print(chunk.delta, end="")
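The _async variants compose with asyncio as usual — for instance, fanning several calls out concurrently with asyncio.gather. In the sketch below a local stub stands in for opper.call_async so the pattern is runnable without an API key:

```python
import asyncio

# Stub standing in for opper.call_async: same call shape, no network.
async def call_async(name, input):
    await asyncio.sleep(0)  # pretend network latency
    return f"{name}({input})"

async def summarize_all(texts):
    # Fan out one call per text; gather preserves input order.
    return await asyncio.gather(
        *(call_async("summarize", input=t) for t in texts)
    )

results = asyncio.run(summarize_all(["first article", "second article"]))
```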

Requirements

  • Python 3.10+
  • Optional: pip install opperai[pydantic] for Pydantic schema support

License

MIT
