VoiceEval SDK (Python)
VoiceEval is an enterprise-grade observability and evaluation SDK designed specifically for Voice Agents and LLM-powered applications. It provides detailed tracing, latency breakdown, and cost analysis with zero configuration.
🚀 Key Features
- 🔎 Zero-Config Auto-Instrumentation: Automatically detects and traces calls from major LLM providers (OpenAI, Anthropic, Google Gemini) without any code changes.
- 🛡️ Secure Ingestion Proxy: All traces are sent through a secure proxy (server/), separating your application logic from downstream observability backends (like Langfuse). This ensures you maintain full control over your data and API keys.
- ⚡ High Performance: Built on OpenTelemetry, using efficient asynchronous batch export (OTLP/HTTP) to ensure negligible runtime overhead.
- 🧩 Standardized Data Model: Uses standard OTel semantic conventions, making your data portable and interoperable with any OTel-compatible backend.
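The batch-export idea behind the performance claim can be sketched in plain Python. This is a simplified illustration of what OTel's BatchSpanProcessor does, not the SDK's actual exporter; the `BatchExporter` class and its parameters are invented for this example:

```python
import threading
from typing import Callable

class BatchExporter:
    """Toy batch exporter: buffers spans and flushes them in groups,
    mimicking the idea behind OTel's BatchSpanProcessor."""

    def __init__(self, flush: Callable[[list], None], max_batch: int = 3):
        self._flush = flush
        self._max_batch = max_batch
        self._buffer: list = []
        self._lock = threading.Lock()

    def export(self, span: dict) -> None:
        # Called on the hot path: just append, no network I/O here.
        with self._lock:
            self._buffer.append(span)
            if len(self._buffer) >= self._max_batch:
                batch, self._buffer = self._buffer, []
                self._flush(batch)  # the real processor runs this on a worker thread

    def shutdown(self) -> None:
        # Drain whatever is left so no spans are lost on exit.
        with self._lock:
            if self._buffer:
                batch, self._buffer = self._buffer, []
                self._flush(batch)

sent: list = []
exporter = BatchExporter(flush=sent.append, max_batch=3)
for i in range(7):
    exporter.export({"span_id": i})
exporter.shutdown()
# sent now holds three batches: two full batches of 3 and a final partial batch of 1
```

Batching amortizes the network cost of export: the application thread only pays for an append, while the flush (an HTTP POST in the real SDK) happens once per batch.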
📦 Installation
Install the SDK via pip (or uv):
pip install voiceeval-sdk
# or
uv add voiceeval-sdk
For local development:
git clone https://github.com/voiceeval/voiceeval-sdk.git
cd voiceeval-sdk
pip install -e .
🏁 Quickstart
1. Initialize the Client
Initialize the Client at the start of your application. This single line sets up the OTel exporter and enables auto-instrumentation for all installed LLM libraries.
from voiceeval import Client
# Initialize SDK - connects to your local proxy or prod endpoint
client = Client(
    api_key="your_voiceeval_api_key",  # or set VOICE_EVAL_API_KEY env var
    base_url="http://api.voiceeval.com/v1/traces",
)
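The `api_key`/env-var fallback mentioned in the comment typically resolves in a fixed order: an explicit argument wins, then the environment. A stdlib-only sketch of that logic (the `resolve_api_key` helper is hypothetical, not part of the public voiceeval API):

```python
import os

def resolve_api_key(explicit_key=None, env_var="VOICE_EVAL_API_KEY"):
    """Illustrative key resolution: an explicit argument takes
    precedence, otherwise fall back to the environment variable.
    (Hypothetical helper, not the SDK's actual code.)"""
    key = explicit_key or os.environ.get(env_var)
    if not key:
        raise ValueError(f"No API key found: pass api_key= or set {env_var}")
    return key

os.environ["VOICE_EVAL_API_KEY"] = "sk-from-env"
assert resolve_api_key("sk-explicit") == "sk-explicit"  # explicit argument wins
assert resolve_api_key() == "sk-from-env"               # env var fallback
```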
2. Run Your Agent
That's it! Any calls to supported libraries like openai or anthropic are now automatically traced.
from openai import OpenAI
# No manual wrapping needed!
client_openai = OpenAI()
response = client_openai.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello world"}],
)
3. Manual Tracing (Optional)
For functions that don't call LLMs (like your business logic or RAG pipeline), use the @observe decorator:
from voiceeval import observe
@observe(name_override="rag_retrieval")
def retrieve_documents(query: str):
    # Your complex logic here
    return docs
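Under the hood, an @observe-style decorator typically wraps the function in a span that records its name and duration. A minimal stdlib-only sketch of that pattern, assuming the real SDK opens an OpenTelemetry span where this version appends to a list:

```python
import functools
import time

RECORDED_SPANS = []  # stand-in for an OTel tracer / exporter

def observe(name_override=None):
    """Minimal @observe-style decorator: records span name, duration,
    and runs even when the wrapped function raises."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            name = name_override or func.__name__
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)
            finally:
                # finally: the span closes on success *and* on error
                RECORDED_SPANS.append({
                    "name": name,
                    "duration_s": time.perf_counter() - start,
                })
        return wrapper
    return decorator

@observe(name_override="rag_retrieval")
def retrieve_documents(query: str):
    return [f"doc matching {query!r}"]

docs = retrieve_documents("billing policy")
# RECORDED_SPANS now contains one span named "rag_retrieval"
```

The `finally` block is the important detail: the span is closed and exported even if the business logic raises, so failed calls still show up in traces.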
🔌 Supported Providers
The SDK automatically instruments the following libraries if they are found in your environment:
| Provider | Library | Status |
|---|---|---|
| OpenAI | openai | ✅ Auto-Instrumented |
| Anthropic | anthropic | ✅ Auto-Instrumented |
| Google Gemini | google-generativeai | ✅ Auto-Instrumented |
Note: If a library is not installed, the SDK gracefully skips it.
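The graceful-skip behavior usually comes down to probing for each library before instrumenting it. A common pattern uses `importlib.util.find_spec`, which checks whether a module is importable without importing it (a sketch of the technique, not the SDK's actual detection code):

```python
from importlib import util

# Top-level module names for the supported providers
SUPPORTED_LIBRARIES = ["openai", "anthropic", "google.generativeai"]

def detect_installed(candidates):
    """Return only the libraries importable in this environment.
    find_spec locates a module without executing its import."""
    found = []
    for name in candidates:
        try:
            if util.find_spec(name) is not None:
                found.append(name)
        except (ImportError, ModuleNotFoundError):
            pass  # a missing parent package also means "skip"
    return found

# Only libraries that are present would get instrumented; the rest are skipped.
installed = detect_installed(SUPPORTED_LIBRARIES)
```

Probing with `find_spec` rather than a bare `import` keeps startup fast and avoids pulling heavy provider SDKs into memory when they are not needed.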
📄 License
MIT