Tracing Using OpenTelemetry Python SDK

Project description

LLumo Telemetry SDK (Python)

A telemetry SDK designed to instrument LLM operations made through OpenAI, Anthropic, and LangChain, sending formatted OpenTelemetry data to your backend telemetry server.

Installation

  1. Create a virtual environment:

    python -m venv .venv
    source .venv/bin/activate  # On Windows: .venv\Scripts\activate
    
  2. Install dependencies:

    pip install -r requirements.txt
    

Setup Guide

Place this initialization code at the entry point of your application, before you initialize any LLM clients.

from src.python_otel.client import initSDK, TelemetryConfig

# Initialize telemetry
config = TelemetryConfig(
    endpoint='http://localhost:4455/api/v1/telemetry',  # Your custom telemetry API endpoint
    authToken='your-auth-token',  # Optional Bearer token for the Authorization header
    flushDelayMillis=500  # Span buffer flush interval in milliseconds (default: 500)
)

# Pass optional library instances if you need manual instrumentation
# config.libraries = {
#     "OpenAI": openai_client,
#     "Anthropic": anthropic_client
# }

initSDK(config)

print("Telemetry configured successfully.")

Configuration Options

    Option              Type    Required  Description
    endpoint            string  Yes       URL of your telemetry ingestion server
    authToken           string  No        Bearer token sent in the Authorization header
    flushDelayMillis    int     No        Span buffer flush interval in milliseconds (default: 500)
    maxExportBatchSize  int     No        Maximum number of spans per export batch (default: 50)
    libraries           dict    No        Dict for injecting specific AI client instances
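The two batching options work together: spans accumulate in a buffer that is shipped when it reaches maxExportBatchSize, and a background timer also flushes it every flushDelayMillis. A minimal conceptual sketch of that behavior (this is not the SDK's code; the real SDK delegates batching to OpenTelemetry's BatchSpanProcessor, and the class and names below are illustrative only):

```python
import threading

class SpanBuffer:
    """Conceptual sketch of what flushDelayMillis and maxExportBatchSize
    control. A real BatchSpanProcessor additionally runs a background
    thread that calls flush() every flush_delay seconds."""

    def __init__(self, flush_delay_millis=500, max_export_batch_size=50,
                 export=print):
        self.flush_delay = flush_delay_millis / 1000.0  # timer interval, seconds
        self.max_batch = max_export_batch_size
        self.export = export  # callable that ships a list of spans
        self.buffer = []
        self.lock = threading.Lock()

    def add(self, span):
        # Buffer the span; a full buffer triggers an immediate export.
        with self.lock:
            self.buffer.append(span)
            full = len(self.buffer) >= self.max_batch
        if full:
            self.flush()

    def flush(self):
        # Atomically swap out the buffer, then export outside the lock.
        with self.lock:
            batch, self.buffer = self.buffer, []
        if batch:
            self.export(batch)
```

Exporting outside the lock keeps span-producing threads from blocking on network I/O, which is the same reason the SDK uses OTel's batch processor rather than exporting each span inline.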

Features

  • Built-in Instrumentations: Supports OpenAI, Anthropic, LangChain, requests, and urllib3.
  • Automatic Data Sanitization: Keys containing the MongoDB-illegal characters . and $ are escaped automatically before transmission.
  • Trace Exporters: Uses BatchSpanProcessor with a custom FormattingExporter to produce structured, ready-to-consume payloads.
  • Performance: Asynchronous exporting via OTel's native batching minimizes impact on application latency.
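The key sanitization in the list above can be pictured roughly as follows. MongoDB rejects field names containing . or $, so nested attribute keys must be rewritten before storage; the fullwidth replacement characters used here are an assumption for illustration, not the SDK's actual escape scheme:

```python
def sanitize_keys(obj):
    """Recursively escape MongoDB-illegal key characters ('.' and '$').
    Replacement characters (fullwidth U+FF0E and U+FF04) are an
    illustrative choice, not the SDK's documented scheme."""
    if isinstance(obj, dict):
        return {
            k.replace(".", "\uff0e").replace("$", "\uff04"): sanitize_keys(v)
            for k, v in obj.items()
        }
    if isinstance(obj, list):
        return [sanitize_keys(v) for v in obj]
    return obj  # scalars pass through unchanged
```

Escaping rather than dropping the characters keeps the transformation reversible, so dotted OTel attribute names like gen_ai.system survive a round trip through the database.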

Download files

Download the file for your platform.

Source Distribution

    otel_tracing_python_harsh-0.1.1.tar.gz (4.6 kB)

Built Distribution

    otel_tracing_python_harsh-0.1.1-py3-none-any.whl (4.8 kB)

File details

Hashes for otel_tracing_python_harsh-0.1.1.tar.gz

    SHA256       6535fa4967d778580e85e1029d395f4d48d4e504cb4bd710bedbbc6ab978139b
    MD5          59976f85e65db39a303f5a4f23625c3f
    BLAKE2b-256  530ada29c2a08ae2901e00a29fb8a4521a3eb50d8cdb56b107e12fc5b0766cc9

Hashes for otel_tracing_python_harsh-0.1.1-py3-none-any.whl

    SHA256       b70229e9360a37e6e09a208aba3e9015f8b287a29bdd10749afb0e8568fb35f7
    MD5          0d01295df41b090297f9ccff597ab34b
    BLAKE2b-256  9ae96bbac5119d680fd7e32784e7e62ee4641c9f241522b9e11bd275fd2347cc
