
Llumo Telemetry SDK for LLM Observability

Project description

Llumo Inference SDK (Python)

A powerful, production-ready telemetry SDK for LLM observability. Automatically capture, group, and export traces from OpenAI, Anthropic, Gemini (Vertex AI), LangChain, and more.

Installation

pip install llumo-inference

Or, if installing from source:

pip install -r requirements.txt

Quick Start

Initialize the SDK at the very beginning of your application.

from llumo_inference import init_telemetry, llumo_trace

# Initialize the telemetry
init_telemetry({
    "token": "YOUR_LLUMO_TOKEN",
    "playgroundName": "my-llm-app", # Optional: categorizes traces in the dashboard
    "baseUrl": "https://api.llumo.ai/telemetry" # Optional
})

# Group your activities into a single trace
with llumo_trace("my-session-name"):
    # Perform your LLM calls or HTTP requests here
    # Everything in this block shares the same Trace ID
    pass

Alternative: Using Decorators

from llumo_inference import llumo_workflow

@llumo_workflow(name="customer-query-flow")
def process_query(text):
    # All instrumentation inside this function is automatically grouped
    pass

Configuration Options

| Option | Type | Required | Description |
| --- | --- | --- | --- |
| token | string | Yes | Your Llumo access token |
| playgroundName | string | No | Label for your application/playground in the dashboard |
| baseUrl | string | No | Telemetry endpoint (see Quick Start) |
| flushDelayMs | int | No | Buffer flush interval in milliseconds (default: 2000) |
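
Putting the options together, a full configuration might look like the sketch below (the token is a placeholder, and `flushDelayMs` overrides the 2000 ms default):

```python
# Example configuration dict for init_telemetry (placeholder values)
config = {
    "token": "YOUR_LLUMO_TOKEN",
    "playgroundName": "my-llm-app",
    "flushDelayMs": 500,  # flush buffered traces every 500 ms instead of 2000
}
# init_telemetry(config)  # pass to the SDK as shown in Quick Start
```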

Features

  • Automated LLM Instrumentation: Powered by Traceloop to support OpenAI, Anthropic, Gemini, LangChain, and more with zero manual code changes.
  • Trace Grouping: Branded context managers (llumo_trace) and decorators (llumo_workflow) to ensure multi-step AI workflows are unified into single traces.
  • Buffered Export: Intelligent buffering that flushes traces by ID, ensuring your data arrives at the backend in complete, structured objects.
  • Privacy & Safety: Automatic sanitization of sensitive data and keys before transmission.
  • Top-level Patching: Immediate instrumentation of requests and urllib3 to ensure no data is lost during startup.
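
The buffered-export behavior can be illustrated with a small sketch: spans accumulate in a per-trace buffer and each trace is released as one complete, structured object. The class and field names here are assumptions for illustration, not the SDK's internals:

```python
from collections import defaultdict

class TraceBuffer:
    """Groups spans by trace ID and releases each trace as one complete object."""

    def __init__(self):
        self._spans = defaultdict(list)

    def add(self, trace_id, span):
        # Accumulate spans under their owning trace.
        self._spans[trace_id].append(span)

    def flush(self, trace_id):
        # Emit all spans for one trace as a single structured payload.
        spans = self._spans.pop(trace_id, [])
        return {"traceId": trace_id, "spans": spans}

buf = TraceBuffer()
buf.add("t1", {"name": "llm.call", "ms": 120})
buf.add("t1", {"name": "http.request", "ms": 40})
payload = buf.flush("t1")
```

Flushing by trace ID, rather than on a fixed schedule alone, is what lets the backend receive each multi-step workflow as one unified object instead of fragments.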

License

MIT
