
llm-otel-kit

Drop-in OpenTelemetry GenAI observability for any LLM backend — local or cloud.

What it does

llm-otel-kit gives you full OTel GenAI semantic convention coverage for any LLM provider in ~10 lines of code:

  • Traces with gen_ai.* span attributes (model, tokens, latency, streaming mode)
  • Metrics — 10 instruments: operation duration, token usage, TTFT, TPOT, throughput, error rate, active requests
  • Logs exported via OTLP with structured context (model, duration, token counts)
  • Dynatrace-ready — correct temporality (DELTA for counters/histograms, CUMULATIVE for UpDownCounters)
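TTFT (time to first token) and TPOT (time per output token) are the two streaming-latency instruments listed above. A minimal sketch of how such values are typically derived from per-token arrival timestamps — the function name and shapes here are illustrative, not the kit's internal API:

```python
def streaming_latency(request_start: float, token_times: list[float]) -> tuple[float, float]:
    """Derive TTFT and TPOT from a request start time and token arrival times.

    TTFT: delay from sending the request until the first token arrives.
    TPOT: average gap between subsequent tokens (0.0 for a single token).
    """
    ttft = token_times[0] - request_start
    if len(token_times) > 1:
        tpot = (token_times[-1] - token_times[0]) / (len(token_times) - 1)
    else:
        tpot = 0.0
    return ttft, tpot

# Example: request sent at t=0.0, tokens arriving at 0.5 s, then every 0.1 s
ttft, tpot = streaming_latency(0.0, [0.5, 0.6, 0.7, 0.8])
```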

Supported Providers

Provider      Type   Config name
Ollama        Local  ollama
OpenAI        Cloud  openai
Anthropic     Cloud  anthropic
vLLM          Local  vllm
llama.cpp     Local  llamacpp
LM Studio     Local  lmstudio
Groq          Cloud  groq
Together      Cloud  together
Fireworks     Cloud  fireworks
Azure OpenAI  Cloud  azure_openai
LiteLLM       Proxy  litellm

Quick Start

from llm_otel_kit import AppConfig, GenAIMetrics, init_observability, create_provider

config = AppConfig.from_env()                # reads the LLM_* / OTLP env vars below
otel = init_observability(config.app_name, config.otlp_endpoint, config.otlp_token)
provider = create_provider(config.provider)  # e.g. "ollama", "openai"
metrics = GenAIMetrics(otel.meter)

# Use provider.complete() / provider.stream() for instrumented LLM calls
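Conceptually, each provider call is wrapped so that duration and errors are recorded around it. The sketch below illustrates that pattern with only the standard library — the names and the dict-based recorder are illustrative stand-ins, not the kit's internals (which emit real OTel instruments):

```python
import time
from contextlib import contextmanager

@contextmanager
def instrumented_call(recorder: dict, model: str):
    """Record call duration, and count errors, around an LLM request.

    Illustrative only: the kit records these via OTel metrics/spans instead
    of a plain dict.
    """
    start = time.monotonic()
    try:
        yield
    except Exception:
        recorder["errors"] = recorder.get("errors", 0) + 1
        raise
    finally:
        recorder.setdefault("durations", []).append(time.monotonic() - start)

recorder = {}
with instrumented_call(recorder, "llama3"):
    pass  # provider.complete(...) would run here
```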

Environment Variables

Variable            Default                 Description
LLM_PROVIDER        ollama                  Provider name (see table above)
LLM_BASE_URL        http://localhost:11434  Provider API base URL
LLM_API_KEY         (empty)                 API key for cloud providers
DEFAULT_MODEL       (empty)                 Fallback model name
APP_NAME            llm-backend             OTel service name
TRACELOOP_BASE_URL  (empty)                 OTLP endpoint URL
DT_OTLP_TOKEN       (empty)                 Dynatrace API token
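For example, a local Ollama backend exporting to a Dynatrace OTLP endpoint might be configured like this — the endpoint URL, model name, and token are placeholders, not working values:

```shell
export LLM_PROVIDER=ollama
export LLM_BASE_URL=http://localhost:11434
export DEFAULT_MODEL=llama3
export APP_NAME=llm-backend
export TRACELOOP_BASE_URL=https://YOUR_ENV_ID.live.dynatrace.com/api/v2/otlp
export DT_OTLP_TOKEN=dt0c01.YOUR_TOKEN
```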

Install

pip install llm-otel-kit

For Anthropic support (quoted, since some shells such as zsh expand the brackets):

pip install 'llm-otel-kit[anthropic]'

License

MIT
