Kalibr Python SDK

Production-grade observability for LLM applications. Automatically instrument OpenAI, Anthropic, and Google AI SDKs with zero code changes.

Features

  • Zero-code instrumentation - Automatic tracing for OpenAI, Anthropic, and Google AI
  • Cost tracking - Real-time cost calculation for all LLM calls
  • Token monitoring - Track input/output tokens across providers
  • Parent-child traces - Automatic trace relationship management
  • Multi-provider support - Works with GPT-4, Claude, Gemini, and more

Installation

pip install kalibr

Quick Start

Auto-instrumentation (Recommended)

Simply import kalibr at the start of your application - all LLM calls are automatically traced:

import os
os.environ["KALIBR_API_KEY"] = "your-kalibr-api-key"  # set before importing kalibr

import kalibr  # Enable auto-instrumentation
import openai

# All OpenAI calls are now automatically traced
client = openai.OpenAI()
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}]
)

Manual Tracing with Decorator

For more control, use the @trace decorator:

from kalibr import trace
import openai

@trace(operation="summarize", provider="openai", model="gpt-4o")
def summarize_text(text: str) -> str:
    client = openai.OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": "Summarize the following text."},
            {"role": "user", "content": text}
        ]
    )
    return response.choices[0].message.content

result = summarize_text("Your long text here...")

Multi-Provider Example

import kalibr
import openai
import anthropic

# OpenAI call - automatically traced
openai_client = openai.OpenAI()
gpt_response = openai_client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Explain quantum computing"}]
)

# Anthropic call - automatically traced
anthropic_client = anthropic.Anthropic()
claude_response = anthropic_client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Explain machine learning"}]
)

Configuration

Configure the SDK using environment variables:

Variable                Description                                  Default
KALIBR_API_KEY          API key for authentication                   Required
KALIBR_COLLECTOR_URL    Collector endpoint URL                       http://localhost:8001/api/ingest
KALIBR_TENANT_ID        Tenant identifier for multi-tenant setups    default
KALIBR_WORKFLOW_ID      Workflow identifier for grouping traces      default
KALIBR_SERVICE_NAME     Service name for OpenTelemetry spans         kalibr-app
KALIBR_ENVIRONMENT      Environment (prod, staging, dev)             prod
KALIBR_AUTO_INSTRUMENT  Enable/disable auto-instrumentation          true
KALIBR_CONSOLE_EXPORT   Enable console span export for debugging     false
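For example, a typical staging setup exports the variables before launching the app; the values below are illustrative, not defaults:

```shell
export KALIBR_API_KEY="your-kalibr-api-key"
export KALIBR_ENVIRONMENT="staging"
export KALIBR_SERVICE_NAME="billing-service"
export KALIBR_CONSOLE_EXPORT="true"   # print spans locally for debugging
```

Then start your application in the same shell (for instance with kalibr serve myapp.py, shown below) so the SDK picks up the settings.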

CLI Tools

The SDK includes command-line tools for running and deploying applications:

# Run your app locally with tracing
kalibr serve myapp.py

# Run with managed runtime lifecycle
kalibr run myapp.py --port 8000

# Deploy to cloud platforms
kalibr deploy myapp.py --runtime fly.io

# Fetch trace data by ID
kalibr capsule <trace-id>

Supported Providers

Provider   Models                        Auto-Instrumentation
OpenAI     GPT-4, GPT-4o, GPT-3.5        Yes
Anthropic  Claude 3 Opus, Sonnet, Haiku  Yes
Google     Gemini Pro, Gemini Flash      Yes
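Google is listed above but not shown in the Quick Start, so here is a minimal sketch of the same import-and-go flow. It assumes the google-generativeai client package; the model name and API key are placeholders, not values from this project.

```python
def ask_gemini(prompt: str) -> str:
    # Importing kalibr first enables auto-instrumentation for the
    # Google AI SDK, just as for OpenAI and Anthropic above.
    import kalibr
    import google.generativeai as genai

    genai.configure(api_key="your-google-api-key")
    model = genai.GenerativeModel("gemini-1.5-flash")  # illustrative model name
    response = model.generate_content(prompt)
    return response.text
```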

Examples

See the examples/ directory for complete examples:

  • basic_example.py - Simple tracing example
  • basic_agent.py - Agent with auto-instrumentation
  • advanced_example.py - Advanced tracing patterns
  • cross_vendor.py - Multi-provider workflows
  • test_mas.py - Multi-agent system demonstration

Development

# Clone the repository
git clone https://github.com/kalibr-ai/kalibr-sdk-python.git
cd kalibr-sdk-python

# Install in development mode
pip install -e ".[dev]"

# Run tests
pytest

# Format code
black kalibr/
ruff check kalibr/

Contributing

We welcome contributions! Please see CONTRIBUTING.md for guidelines.

License

MIT License - see LICENSE for details.
