# Maev Python SDK

One-liner observability for AI agents.
## Installation

```shell
pip install maev-sdk
```
## Quickstart

```python
import maev

maev.init(api_key="vl_xxx")
```
That's it. Add this before your agent runs. Maev automatically instruments OpenAI, Anthropic, LangChain, CrewAI, LiteLLM, Gemini, Cohere, LlamaIndex, and other popular LLM libraries, and sends all telemetry to your Maev dashboard.
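Auto-instrumentation of this kind is usually done by wrapping the client libraries' call methods at import time, so every call is recorded without code changes. A minimal sketch of the pattern, using a fake `llm` module and an illustrative `record_span` helper (neither is Maev's actual internals):

```python
import functools
import types

# Telemetry sink for the sketch; a real SDK would export spans asynchronously.
spans = []

def record_span(name):
    """Wrap a function so every call is recorded as a span-like dict."""
    def deco(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            result = fn(*args, **kwargs)
            spans.append({"name": name, "kwargs": kwargs})
            return result
        return wrapper
    return deco

# Stand-in for a third-party LLM client module.
llm = types.SimpleNamespace(complete=lambda prompt: f"echo: {prompt}")

# "Instrumentation": swap the attribute for the wrapped version.
llm.complete = record_span("llm.complete")(llm.complete)

print(llm.complete(prompt="hi"))  # the call behaves exactly as before
print(spans)                      # ...but a span was recorded as a side effect
```

The caller never sees the wrapper: the function's return value is untouched and only the telemetry sink observes the call.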
## Serverless functions (Lambda, Cloud Functions, Vercel, etc.)
In long-running processes (servers, scripts, notebooks), Maev automatically sends all telemetry when the process exits. No extra code needed.
In serverless functions, the process doesn't exit cleanly — it gets frozen or killed by the platform the moment your handler returns. Any telemetry still in the buffer is silently dropped, and your session never closes in the dashboard.
Call `maev.flush()` at the end of your handler to force delivery before the freeze:

```python
import maev

maev.init(api_key="vl_xxx", agent_name="My Lambda")

def handler(event, context):
    # ... your agent logic ...
    maev.flush()  # send everything before Lambda freezes
    return result
```
`flush()` does two things, in order:

- Forces the OpenTelemetry exporter to drain any buffered spans (LLM call data)
- Sends a `session.end` event so Maev closes and classifies the session

It is safe to call multiple times; only the first call does anything.
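That run-once behavior can be implemented with a simple thread-safe guard. The following is a generic sketch of the pattern, not Maev's actual code:

```python
import threading

class OnceFlusher:
    """Run a flush callback at most once, even under concurrent calls."""

    def __init__(self, do_flush):
        self._do_flush = do_flush
        self._lock = threading.Lock()
        self._done = False

    def flush(self):
        with self._lock:
            if self._done:
                return      # subsequent calls are no-ops
            self._done = True
        self._do_flush()    # drain exporters, send session.end, ...

calls = []
flusher = OnceFlusher(lambda: calls.append("flushed"))
flusher.flush()
flusher.flush()
print(calls)  # the callback ran exactly once
```

Flipping the flag inside the lock but running the callback outside it keeps the critical section short, so a slow network flush never blocks other callers on the lock.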
Environments where you must call `flush()`:
| Platform | Why |
|---|---|
| AWS Lambda | Handler returns → process frozen immediately |
| Google Cloud Functions | Same — process suspended after return |
| Vercel / Netlify Functions | Execution context torn down after response |
| Azure Functions | Consumption plan freezes after invocation |
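In these environments it is also worth guaranteeing the flush on error paths: a `try`/`finally` wrapper delivers telemetry even when the handler raises. A generic sketch (the decorator is illustrative, not part of the SDK; with the SDK you would pass `maev.flush`):

```python
import functools

def flush_on_exit(flush):
    """Decorator: call flush() after the handler, even if it raises."""
    def deco(handler):
        @functools.wraps(handler)
        def wrapper(event, context):
            try:
                return handler(event, context)
            finally:
                flush()  # runs on success and on exceptions alike
        return wrapper
    return deco

# Demo with a stub flush callback standing in for maev.flush.
flushes = []

@flush_on_exit(lambda: flushes.append("flush"))
def handler(event, context):
    return {"statusCode": 200}

print(handler({}, None))
print(flushes)
```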
Environments where `flush()` is optional (but harmless):
- Long-running servers (FastAPI, Flask, Django)
- CLI scripts
- Jupyter notebooks
- Docker containers
## How it works
- Telemetry is collected and sent asynchronously — zero impact on your agent's performance.
- Sessions are automatically tracked from the first LLM call through to process exit.
- Failures are detected and classified server-side — no configuration required.
- Every failure triggers an alert in your Maev dashboard and via email.
- Active Maev Fix prompts refresh in the background and inject automatically into supported libraries.
- The SDK prints a terminal notice when a newer `maev-sdk` version is available on PyPI.
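The asynchronous delivery described above is commonly built as a background queue drained by a worker thread, so exporting never blocks the agent. A minimal sketch of that design (not Maev's actual exporter):

```python
import queue
import threading

class AsyncExporter:
    """Buffer telemetry on the caller's thread; ship it from a worker."""

    def __init__(self, send):
        self._send = send
        self._q = queue.Queue()
        self._worker = threading.Thread(target=self._run, daemon=True)
        self._worker.start()

    def _run(self):
        while True:
            item = self._q.get()
            if item is None:   # sentinel: shut down after draining
                break
            self._send(item)

    def export(self, span):
        self._q.put(span)      # returns immediately; no network on this path

    def flush(self):
        self._q.put(None)      # FIFO: processed after all buffered spans
        self._worker.join()

sent = []
exporter = AsyncExporter(sent.append)
exporter.export({"name": "llm.call"})
exporter.flush()
print(sent)
```

Because the queue is FIFO, the shutdown sentinel is only seen after every buffered span has been sent, which is exactly the guarantee a serverless `flush()` needs.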
## Supported Libraries
Maev auto-instruments all major LLM frameworks including:
- OpenAI
- Anthropic
- LangChain
- CrewAI
- LiteLLM
- Google Gemini (`google-generativeai`)
- Cohere
- LlamaIndex
- Mistral
- AWS Bedrock
- And more
## Requirements
- Python >= 3.9
## Your API Key
Find your API key in the Maev Dashboard under Settings.
Keys follow the format `vl_` followed by 64 hex characters.
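Given the stated format, a key can be sanity-checked locally before initialization. A sketch assuming lowercase hex digits (this helper is not part of the SDK):

```python
import re

# vl_ prefix followed by exactly 64 lowercase hex characters.
KEY_RE = re.compile(r"vl_[0-9a-f]{64}")

def looks_like_maev_key(key: str) -> bool:
    """Return True if key matches the documented vl_ + 64-hex format."""
    return bool(KEY_RE.fullmatch(key))

print(looks_like_maev_key("vl_" + "ab" * 32))  # True
print(looks_like_maev_key("vl_short"))         # False
```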
## Self-hosting
If you are running Maev on your own infrastructure, pass the `endpoint` parameter:

```python
maev.init(api_key="vl_xxx", endpoint="https://your-maev-instance.com")
```
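In practice the endpoint is often taken from configuration rather than hard-coded. A sketch using a hypothetical `MAEV_ENDPOINT` environment variable and a placeholder default URL (a common deployment pattern, not a documented SDK feature):

```python
import os

DEFAULT_ENDPOINT = "https://api.maev.example"  # placeholder hosted default

def resolve_endpoint() -> str:
    """Prefer an environment override; fall back to the hosted default."""
    return os.environ.get("MAEV_ENDPOINT", DEFAULT_ENDPOINT)

os.environ["MAEV_ENDPOINT"] = "https://your-maev-instance.com"
print(resolve_endpoint())
# maev.init(api_key="vl_xxx", endpoint=resolve_endpoint())
```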
## File details

### maev_sdk-0.4.6.tar.gz

- Size: 22.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.3

| Algorithm | Hash digest |
|---|---|
| SHA256 | `c3e897964d8a6cdc8d39194088b02e2a46d86eaa679f5580538e16ff7f83285c` |
| MD5 | `7ab020e13d19c7f4a37d59327aa3016a` |
| BLAKE2b-256 | `ba29836b1daffe21747adfaac863107777705dd6e0160fa9dffe3ed0a138ab25` |
### maev_sdk-0.4.6-py3-none-any.whl

- Size: 29.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.3

| Algorithm | Hash digest |
|---|---|
| SHA256 | `385490d6578ce929b686acb7011b440f7b2322424e20bb97f5d9d483f66fba9b` |
| MD5 | `34c87af4d2ad189232d0b7ca944e3caa` |
| BLAKE2b-256 | `3d1553094712860437774574761eb92cb1c3f19c02836282c38d85dea45b4c11` |