Llumo Telemetry SDK for LLM Observability
Llumo Inference SDK (Python)
A powerful, production-ready telemetry SDK for LLM observability. Automatically capture, group, and export traces from OpenAI, Anthropic, Gemini (Vertex AI), LangChain, and more.
Installation
```bash
pip install llumo-inference
```
Or, if installing from source:
```bash
pip install -r requirements.txt
```
Quick Start
Initialize the SDK at the very beginning of your application.
```python
from llumo_inference import init_telemetry, llumo_trace

# Initialize the telemetry
init_telemetry({
    "token": "YOUR_LLUMO_TOKEN",
    "playgroundName": "my-llm-app",  # Optional: categorizes traces in the dashboard
    "baseUrl": "https://api.llumo.ai/telemetry"  # Optional
})

# Group your activities into a single trace
with llumo_trace("my-session-name"):
    # Perform your LLM calls or HTTP requests here;
    # everything in this block shares the same trace ID
    pass
```
Alternative: Using Decorators
```python
from llumo_inference import llumo_workflow

@llumo_workflow(name="customer-query-flow")
def process_query(text):
    # All instrumentation inside this function is automatically grouped
    pass
```
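A workflow decorator like this is typically just a thin wrapper around a trace context manager. The sketch below shows the general pattern with hypothetical `trace`/`workflow` helpers (assumptions for illustration, not the SDK's implementation):

```python
import functools
from contextlib import contextmanager

# Hypothetical sketch: a decorator that runs the whole function inside a trace.
active_traces = []

@contextmanager
def trace(name):
    active_traces.append(name)
    try:
        yield
    finally:
        active_traces.pop()

def workflow(name):
    """Decorator form: the entire function body executes inside one trace."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            with trace(name):
                return func(*args, **kwargs)
        return wrapper
    return decorator

@workflow(name="customer-query-flow")
def process_query(text):
    # Any instrumented call made here sees "customer-query-flow" as active.
    return text.upper(), list(active_traces)

result, traces_during_call = process_query("hello")
```

`functools.wraps` preserves the decorated function's name and docstring, which keeps stack traces and introspection readable.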
Configuration Options
| Option | Type | Required | Description |
|---|---|---|---|
| `token` | string | Yes | Your Llumo access token |
| `playgroundName` | string | No | Label for your application/playground in the dashboard |
| `baseUrl` | string | No | Telemetry endpoint (defaults to `https://api.llumo.ai/telemetry`) |
| `flushDelayMs` | int | No | Buffer flush interval in milliseconds (default: 2000) |
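Putting the options together, a fully configured initialization might look like this (values are placeholders; only `token` is required):

```python
from llumo_inference import init_telemetry

# Example configuration using every documented option.
init_telemetry({
    "token": "YOUR_LLUMO_TOKEN",                  # required: your access token
    "playgroundName": "my-llm-app",               # optional: dashboard label
    "baseUrl": "https://api.llumo.ai/telemetry",  # optional: telemetry endpoint
    "flushDelayMs": 2000,                         # optional: flush interval (ms)
})
```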
Features
- Automated LLM Instrumentation: Powered by Traceloop to support OpenAI, Anthropic, Gemini, LangChain, and more with zero manual code changes.
- Trace Grouping: Branded context managers (`llumo_trace`) and decorators (`llumo_workflow`) ensure multi-step AI workflows are unified into single traces.
- Buffered Export: Intelligent buffering that flushes traces by ID, ensuring your data arrives at the backend in complete, structured objects.
- Privacy & Safety: Automatic sanitization of sensitive data and keys before transmission.
- Top-level Patching: Immediately instruments `requests` and `urllib3` so that no data is lost during startup.
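To illustrate what "flushes traces by ID" means in practice, here is a minimal, self-contained sketch of ID-keyed buffering. `TraceBuffer` is a hypothetical class for illustration, not the SDK's actual exporter:

```python
from collections import defaultdict

# Hypothetical sketch of buffered export keyed by trace ID.
class TraceBuffer:
    """Accumulate spans per trace ID; flush each trace as one complete object."""

    def __init__(self, export):
        self._spans = defaultdict(list)
        self._export = export  # callable receiving (trace_id, spans)

    def add(self, trace_id, span):
        self._spans[trace_id].append(span)

    def flush(self):
        # Each trace leaves the buffer whole, never as scattered fragments.
        for trace_id, spans in self._spans.items():
            self._export(trace_id, spans)
        self._spans.clear()

exported = {}
buf = TraceBuffer(export=lambda tid, spans: exported.update({tid: spans}))
buf.add("trace-1", {"op": "openai.chat"})
buf.add("trace-1", {"op": "http.request"})
buf.add("trace-2", {"op": "anthropic.messages"})
buf.flush()
```

Grouping spans before export means the backend receives each trace as a single structured payload instead of reassembling it from interleaved spans.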
License
MIT
File details
Details for the file llumo_inference-0.1.3.tar.gz.
File metadata
- Download URL: llumo_inference-0.1.3.tar.gz
- Upload date:
- Size: 6.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `5db5416523aacab0b537b7d138bcd1f5a10eaa65d42734706ac10371e7c3a5a8` |
| MD5 | `2c13880bbbc7360b67a749b80d9772bf` |
| BLAKE2b-256 | `7f5f7ddc3700f86633a4f6a3aa51573d4e55be0caae1a4270947480f8fcd96db` |
File details
Details for the file llumo_inference-0.1.3-py3-none-any.whl.
File metadata
- Download URL: llumo_inference-0.1.3-py3-none-any.whl
- Upload date:
- Size: 6.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `6e63d377490bc82a50a56dbfac99fd979c01d5859a15f22e05f99763765f5a58` |
| MD5 | `6976602010b3eb371885c09ccf327ab8` |
| BLAKE2b-256 | `592038b9275620fd5d46e9ce92427689b8d72aff61a7df514caf6753538e19e5` |