Python client for Overmind API
Overmind SDK
Automatic observability for LLM applications. One call to init() instruments your existing OpenAI, Anthropic, Google Gemini, or Agno code — no proxy, no key sharing, no import changes.
Features
- Zero-change instrumentation: Keep using your existing LLM clients as-is
- Auto-detection: Detects installed providers automatically, or specify them explicitly
- Custom spans: Add your own tracing spans alongside LLM calls
- User & tag context: Tag traces with user IDs, custom attributes, and exceptions
- OpenTelemetry native: Built on standard OTLP — works with any OTel-compatible backend
Installation
pip install overmind-sdk
Install alongside your LLM provider package:
pip install overmind-sdk openai # OpenAI
pip install overmind-sdk anthropic # Anthropic
pip install overmind-sdk google-genai # Google Gemini
pip install overmind-sdk agno # Agno
Quick Start
1. Get your API key
Sign up at console.overmindlab.ai — your API key is shown immediately after signup.
2. Initialize the SDK
Call init() once at application startup, before any LLM calls:
from overmind_sdk import init
init(
    overmind_api_key="ovr_...",  # or set OVERMIND_API_KEY env var
    service_name="my-service",
    environment="production",
)
That's it. Your existing LLM code works unchanged and every call is automatically traced.
3. Use your LLM client as normal
from openai import OpenAI
client = OpenAI() # your existing client, unchanged
response = client.chat.completions.create(
    model="gpt-5-mini",
    messages=[{"role": "user", "content": "Explain quantum computing"}],
)
print(response.choices[0].message.content)
Traces appear in your Overmind dashboard in real time.
Provider Examples
OpenAI
from overmind_sdk import init
from openai import OpenAI
init(service_name="my-service", providers=["openai"])
client = OpenAI()
response = client.chat.completions.create(
    model="gpt-5",
    messages=[{"role": "user", "content": "Hello!"}],
)
Anthropic
from overmind_sdk import init
import anthropic
init(service_name="my-service", providers=["anthropic"])
client = anthropic.Anthropic()
message = client.messages.create(
    model="claude-opus-4-5",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}],
)
Google Gemini
from overmind_sdk import init
from google import genai
init(service_name="my-service", providers=["google"])
client = genai.Client()
response = client.models.generate_content(
    model="gemini-2.0-flash",
    contents="Explain quantum computing",
)
Agno
from overmind_sdk import init
from agno.agent import Agent
from agno.models.openai import OpenAIChat
init(service_name="my-service", providers=["agno"])
agent = Agent(model=OpenAIChat(id="gpt-5"), markdown=True)
agent.print_response("Write a short poem about the sea.")
Auto-detect all installed providers
Pass an empty providers list (or omit it) to automatically instrument every supported provider that is installed:
from overmind_sdk import init
init(service_name="my-service") # auto-detects openai, anthropic, google, agno
Configuration
init() parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| overmind_api_key | str \| None | None | Your Overmind API key. Falls back to the OVERMIND_API_KEY env var. |
| service_name | str \| None | None | Name of your service (shown in traces). Also reads OVERMIND_SERVICE_NAME. Defaults to "unknown-service". |
| environment | str \| None | None | Deployment environment ("production", "staging", etc.). Also reads OVERMIND_ENVIRONMENT. Defaults to "development". |
| providers | list[str] \| None | None | Providers to instrument. Supported: "openai", "anthropic", "google", "agno". None or empty = auto-detect. |
| overmind_base_url | str \| None | None | Override the Overmind API URL. Falls back to the OVERMIND_API_URL env var, then https://api.overmindlab.ai. |
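For reference, a fully explicit init() call exercising every parameter in the table might look like the following sketch. All values here are illustrative placeholders, not real credentials or recommended settings:

```python
from overmind_sdk import init

# Every value below is a placeholder for illustration only.
init(
    overmind_api_key="ovr_...",            # or set OVERMIND_API_KEY
    service_name="billing-worker",
    environment="staging",
    providers=["openai", "anthropic"],     # instrument only these two
    overmind_base_url="https://api.overmindlab.ai",
)
```

In practice most deployments pass only service_name and environment in code and leave the rest to environment variables.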
Environment variables
| Variable | Description |
|---|---|
| OVERMIND_API_KEY | Your Overmind API key |
| OVERMIND_SERVICE_NAME | Service name (overridden by the service_name param) |
| OVERMIND_ENVIRONMENT | Environment name (overridden by the environment param) |
| OVERMIND_API_URL | Custom API endpoint URL |
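With these variables set, the code can reduce to a bare init() call. A sketch of the shell setup, using placeholder values:

```shell
export OVERMIND_API_KEY="ovr_..."
export OVERMIND_SERVICE_NAME="my-service"
export OVERMIND_ENVIRONMENT="production"
# Optional: only needed when targeting a non-default endpoint
# export OVERMIND_API_URL="https://api.overmindlab.ai"
```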
Additional SDK Functions
get_tracer()
Get the OpenTelemetry tracer to create custom spans around any block of code:
from overmind_sdk import init, get_tracer
init(service_name="my-service")
tracer = get_tracer()
with tracer.start_as_current_span("process-document") as span:
    span.set_attribute("document.id", doc_id)
    result = process(doc)
set_user()
Tag the current trace with user identity. Call this in your request handler or middleware:
from overmind_sdk import set_user
# In a FastAPI middleware:
@app.middleware("http")
async def add_user_context(request: Request, call_next):
    if request.state.user:
        set_user(
            user_id=request.state.user.id,
            email=request.state.user.email,
        )
    return await call_next(request)
| Parameter | Required | Description |
|---|---|---|
| user_id | Yes | Unique identifier for the user |
| email | No | User's email address |
| username | No | User's display name |
set_tag()
Add a custom attribute to the current span:
from overmind_sdk import set_tag
set_tag("feature.flag", "new-checkout-flow")
set_tag("tenant.id", tenant_id)
capture_exception()
Record an exception on the current span and mark it as an error:
from overmind_sdk import capture_exception
try:
    result = risky_llm_call()
except Exception as e:
    capture_exception(e)
    raise
Full Example
import os
from overmind_sdk import init, get_tracer, set_user, set_tag, capture_exception
from openai import OpenAI
os.environ["OVERMIND_API_KEY"] = "ovr_your_key_here"
init(
    service_name="customer-support",
    environment="production",
    providers=["openai"],
)
client = OpenAI()
def handle_query(user_id: str, question: str) -> str:
    set_user(user_id=user_id)
    set_tag("workflow", "support")
    tracer = get_tracer()
    with tracer.start_as_current_span("handle-support-query"):
        try:
            response = client.chat.completions.create(
                model="gpt-5-mini",
                messages=[
                    {"role": "system", "content": "You are a helpful customer support agent."},
                    {"role": "user", "content": question},
                ],
            )
            return response.choices[0].message.content
        except Exception as e:
            capture_exception(e)
            raise
answer = handle_query("user-123", "How do I reset my password?")
print(answer)
What Happens After Your First Traces
Once Overmind has collected 10+ traces for a given prompt pattern, the optimization engine starts automatically:
- Agent detection — extracts prompt templates from your traces
- LLM judge scoring — evaluates each trace against auto-generated quality criteria
- Prompt experimentation — generates and tests candidate prompt variations
- Model backtesting — replays traces through alternative models to find cost/quality tradeoffs
- Suggestions — surfaces the best alternatives in your dashboard
See How Optimization Works for details.
We appreciate any feedback or suggestions. Reach out at support@overmindlab.ai.