# Overmind Python SDK

Python client for the Overmind API.
Automatic observability for LLM applications. One call to overmind.init() instruments your existing OpenAI, Anthropic, Google Gemini, or Agno code — no proxy, no key sharing, no import changes.
## What is Overmind?
Overmind automatically optimizes your AI agents. It collects execution traces, evaluates them with LLM judges, and recommends better prompts and models to reduce cost, improve quality, and lower latency.
- Zero-change instrumentation: Keep using your existing LLM clients as-is
- Auto-detection: Detects installed providers automatically, or specify them explicitly
- Custom spans: Add your own tracing spans alongside LLM calls
- User & tag context: Tag traces with user IDs, custom attributes, and exceptions
- OpenTelemetry native: Built on standard OTLP — works with any OTel-compatible backend
- Managed service: console.overmindlab.ai
- Self-hosted (open-source): github.com/overmind-core/overmind
- Docs: docs.overmindlab.ai
## Installation

```bash
pip install overmind
```

Install alongside your LLM provider package:

```bash
pip install overmind openai        # OpenAI
pip install overmind anthropic     # Anthropic
pip install overmind google-genai  # Google Gemini
pip install overmind agno          # Agno
```
## Quick Start

### 1. Get your API key

Sign up at console.overmindlab.ai; your API key is shown immediately after signup.

### 2. Initialize the SDK

Call `overmind.init()` once at application startup, before any LLM calls:

```python
import overmind

overmind.init(
    overmind_api_key="ovr_...",  # or set the OVERMIND_API_KEY env var
    service_name="my-service",
    environment="production",
)
```
That's it. Your existing LLM code works unchanged and every call is automatically traced.
### 3. Use your LLM client as normal

```python
from openai import OpenAI

client = OpenAI()  # your existing client, unchanged
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Explain quantum computing"}],
)
print(response.choices[0].message.content)
```
Traces appear in your Overmind dashboard in real time.
## Provider Examples

### OpenAI

```python
import overmind
from openai import OpenAI

overmind.init(service_name="my-service", providers=["openai"])

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)
```
### Anthropic

```python
import overmind
import anthropic

overmind.init(service_name="my-service", providers=["anthropic"])

client = anthropic.Anthropic()
message = client.messages.create(
    model="claude-haiku-4-5",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}],
)
```
### Google Gemini

```python
import overmind
from google import genai

overmind.init(service_name="my-service", providers=["google"])

client = genai.Client()
response = client.models.generate_content(
    model="gemini-2.0-flash",
    contents="Explain quantum computing",
)
```
### Agno

```python
import overmind
from agno.agent import Agent
from agno.models.openai import OpenAIChat

overmind.init(service_name="my-service", providers=["agno"])

agent = Agent(model=OpenAIChat(id="gpt-4o-mini"), markdown=True, name="Storyteller")
agent.print_response("Write a short poem about the sea.")
```
### Auto-detect all installed providers

Omit `providers` (or pass an empty list) to automatically instrument every supported provider that is installed:

```python
import overmind

overmind.init(service_name="my-service")  # auto-detects openai, anthropic, google, agno
```
## Configuration

### `overmind.init()` parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| `overmind_api_key` | `str \| None` | `None` | Your Overmind API key. Falls back to the `OVERMIND_API_KEY` env var. |
| `service_name` | `str` | `"unknown-service"` | Name of your service (shown in traces). Also reads `OVERMIND_SERVICE_NAME`. |
| `environment` | `str` | `"development"` | Deployment environment (`"production"`, `"staging"`, etc.). Also reads `OVERMIND_ENVIRONMENT`. |
| `providers` | `list[str] \| None` | `None` | Providers to instrument. Supported: `"openai"`, `"anthropic"`, `"google"`, `"agno"`. `None` or `[]` auto-detects all installed providers. |
| `overmind_base_url` | `str \| None` | `None` | Override the Overmind API URL. Falls back to `OVERMIND_API_URL`, then `https://api.overmindlab.ai`. |
### Environment variables

| Variable | Description |
|---|---|
| `OVERMIND_API_KEY` | Your Overmind API key |
| `OVERMIND_SERVICE_NAME` | Service name (overridden by the `service_name` param) |
| `OVERMIND_ENVIRONMENT` | Environment name (overridden by the `environment` param) |
| `OVERMIND_API_URL` | Custom API endpoint URL |
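The two tables above imply a common precedence: an explicit `init()` argument wins over its environment variable, which wins over the default. A minimal sketch of that resolution order (`resolve_setting` is a hypothetical helper for illustration, not part of the SDK):

```python
import os

def resolve_setting(param, env_var, default):
    # Hypothetical helper: an explicit argument wins, then the env var, then the default.
    if param is not None:
        return param
    return os.environ.get(env_var, default)

# With no argument and no env var set, the default applies.
os.environ.pop("OVERMIND_SERVICE_NAME", None)
print(resolve_setting(None, "OVERMIND_SERVICE_NAME", "unknown-service"))  # unknown-service
print(resolve_setting("my-service", "OVERMIND_SERVICE_NAME", "unknown-service"))  # my-service
```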
## Self-Hosted
The SDK works with both the managed service and the self-hosted open-source edition. API keys prefixed with ovr_core_ are automatically routed to http://localhost:8000. You can also set OVERMIND_API_URL to point to your own deployment.
## Additional SDK Functions

### `overmind.get_tracer()`

Get the OpenTelemetry tracer to create custom spans around any block of code:

```python
import overmind

overmind.init(service_name="my-service")

tracer = overmind.get_tracer()
with tracer.start_as_current_span("process-document") as span:
    span.set_attribute("document.id", doc_id)
    result = process(doc)
```
### `overmind.set_user()`

Tag the current trace with user identity. Call this in your request handler or middleware:

```python
import overmind
from fastapi import FastAPI, Request

app = FastAPI()

# In a FastAPI middleware:
@app.middleware("http")
async def add_user_context(request: Request, call_next):
    if request.state.user:
        overmind.set_user(
            user_id=request.state.user.id,
            email=request.state.user.email,
        )
    return await call_next(request)
```

| Parameter | Required | Description |
|---|---|---|
| `user_id` | Yes | Unique identifier for the user |
| `email` | No | User's email address |
| `username` | No | User's display name |
### `overmind.set_tag()`

Add a custom attribute to the current span:

```python
overmind.set_tag("feature.flag", "new-checkout-flow")
overmind.set_tag("tenant.id", tenant_id)
```
### `overmind.capture_exception()`

Record an exception on the current span and mark it as an error:

```python
try:
    result = risky_llm_call()
except Exception as e:
    overmind.capture_exception(e)
    raise
```
## Full Example

```python
import os

import overmind
from openai import OpenAI

os.environ["OVERMIND_API_KEY"] = "ovr_your_key_here"

overmind.init(
    service_name="customer-support",
    environment="production",
    providers=["openai"],
)

client = OpenAI()

def handle_query(user_id: str, question: str) -> str:
    overmind.set_user(user_id=user_id)
    tracer = overmind.get_tracer()
    with tracer.start_as_current_span("handle-support-query"):
        try:
            response = client.chat.completions.create(
                model="gpt-4o-mini",
                messages=[
                    {"role": "system", "content": "You are a helpful customer support agent."},
                    {"role": "user", "content": question},
                ],
            )
            return response.choices[0].message.content
        except Exception as e:
            overmind.capture_exception(e)
            raise

answer = handle_query("user-123", "How do I reset my password?")
print(answer)
```
## What Happens After Your First Traces

Once Overmind has collected 30+ traces for a given prompt pattern, the optimization engine starts automatically:

1. **Agent detection**: extracts prompt templates from your traces
2. **LLM judge scoring**: evaluates each trace against auto-generated quality criteria
3. **Prompt experimentation**: generates and tests candidate prompt variations
4. **Model backtesting**: replays traces through alternative models to find cost/quality tradeoffs
5. **Suggestions**: surfaces the best alternatives in your dashboard
See How Optimization Works for details.
## Documentation

Full documentation is available at docs.overmindlab.ai.
## License

MIT

We appreciate any feedback or suggestions. Reach out at support@overmindlab.ai.