
Unified LLM Observability & Multi-Model AI Integration Framework - Deploy to GPT, Claude, Gemini, Copilot with full telemetry.

Project description

Kalibr Python SDK

Production-grade observability and execution intelligence for LLM applications. Automatically instrument OpenAI, Anthropic, and Google AI SDKs with zero code changes.


Features

  • Zero-code instrumentation - Automatic tracing for OpenAI, Anthropic, and Google AI SDKs
  • Outcome-conditioned routing - Query for optimal models based on historical success rates
  • TraceCapsule - Cross-agent context propagation for multi-agent systems
  • Cost tracking - Real-time cost calculation for all LLM calls
  • Token monitoring - Track input/output tokens across providers
  • Framework integrations - LangChain, CrewAI, OpenAI Agents SDK

Installation

pip install kalibr

Quick Start

Auto-instrumentation (Recommended)

Simply import kalibr at the start of your application - all LLM calls are automatically traced:

import kalibr  # Must be FIRST import
from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}]
)
# That's it. The call is automatically traced.
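Under the hood, import-time auto-instrumentation of this kind is typically done by patching the provider SDK's client methods so each call emits a span. A minimal sketch of that idea, using a stand-in client class rather than the real OpenAI SDK (this is illustrative only, not kalibr's actual implementation):

```python
import functools
import time

class DummyClient:
    """Stand-in for a provider SDK client (not the real OpenAI class)."""
    def create(self, model, messages):
        return {"model": model, "content": "hi"}

def instrument(cls, method_name, on_span):
    """Wrap `method_name` on `cls` so every call emits a span record."""
    original = getattr(cls, method_name)

    @functools.wraps(original)
    def wrapper(self, *args, **kwargs):
        start = time.perf_counter()
        result = original(self, *args, **kwargs)
        on_span({
            "operation": method_name,
            "model": kwargs.get("model"),
            "duration_ms": (time.perf_counter() - start) * 1000,
        })
        return result

    setattr(cls, method_name, wrapper)

spans = []
instrument(DummyClient, "create", spans.append)

# Callers use the client exactly as before; the span is recorded as a side effect.
DummyClient().create(model="gpt-4o", messages=[{"role": "user", "content": "Hello!"}])
print(spans[0]["model"])  # gpt-4o
```

Because the patching happens at import time, this is why `import kalibr` must come before the provider SDK imports.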

Manual Tracing with @trace Decorator

For more control, use the @trace decorator:

from kalibr import trace
from openai import OpenAI

@trace(operation="summarize", provider="openai", model="gpt-4o")
def summarize_text(text: str) -> str:
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": "Summarize the following text."},
            {"role": "user", "content": text}
        ]
    )
    return response.choices[0].message.content
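Conceptually, a tracing decorator like this records the declared metadata plus timing around each call. A simplified stand-in (hypothetical names and storage; not kalibr's actual `trace` implementation):

```python
import functools
import time
import uuid

TRACES = []  # stand-in for an exporter/collector

def trace(operation, provider, model):
    """Hypothetical sketch of a @trace-style decorator."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            span = {
                "trace_id": uuid.uuid4().hex,
                "operation": operation,
                "provider": provider,
                "model": model,
            }
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                # Record duration even if the wrapped call raises.
                span["duration_ms"] = (time.perf_counter() - start) * 1000
                TRACES.append(span)
        return wrapper
    return decorator

@trace(operation="summarize", provider="openai", model="gpt-4o")
def summarize_text(text: str) -> str:
    return text[:10]  # placeholder for a real LLM call

summarize_text("some long document text")
print(TRACES[0]["operation"])  # summarize
```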

Multi-Provider Example

import kalibr
from openai import OpenAI
from anthropic import Anthropic

# Both are automatically traced
openai_client = OpenAI()
anthropic_client = Anthropic()

gpt_response = openai_client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Explain quantum computing"}]
)

claude_response = anthropic_client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Explain machine learning"}]
)

Outcome-Conditioned Routing

Query Kalibr for optimal model recommendations based on real execution outcomes:

from kalibr import get_policy, report_outcome

# Before executing - get the best model for your goal
policy = get_policy(goal="book_meeting")
print(f"Use {policy['recommended_model']} - {policy['outcome_success_rate']:.0%} success rate")

# Execute with the recommended model
# ...

# After executing - report what happened
report_outcome(
    trace_id="abc123",
    goal="book_meeting",
    success=True
)

With Constraints

from kalibr import get_policy

policy = get_policy(
    goal="resolve_ticket",
    constraints={
        "max_cost_usd": 0.05,
        "max_latency_ms": 3000,
        "min_quality": 0.8
    }
)
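Conceptually, constraint-aware routing filters out models that violate any constraint and then picks the best performer among the survivors. A rough sketch with made-up per-model stats (not kalibr's actual algorithm or data):

```python
def pick_model(candidates, constraints):
    """Return the model with the highest success rate that meets every constraint."""
    eligible = [
        c for c in candidates
        if c["cost_usd"] <= constraints["max_cost_usd"]
        and c["latency_ms"] <= constraints["max_latency_ms"]
        and c["quality"] >= constraints["min_quality"]
    ]
    return max(eligible, key=lambda c: c["success_rate"])["model"]

# Hypothetical historical stats per model
stats = [
    {"model": "gpt-4o", "cost_usd": 0.08, "latency_ms": 2500,
     "quality": 0.92, "success_rate": 0.91},
    {"model": "gpt-4o-mini", "cost_usd": 0.01, "latency_ms": 1200,
     "quality": 0.84, "success_rate": 0.88},
    {"model": "claude-3-5-sonnet-20241022", "cost_usd": 0.04, "latency_ms": 2000,
     "quality": 0.90, "success_rate": 0.86},
]

# gpt-4o is excluded by the cost cap; gpt-4o-mini wins on success rate.
print(pick_model(stats, {"max_cost_usd": 0.05, "max_latency_ms": 3000, "min_quality": 0.8}))
# gpt-4o-mini
```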

TraceCapsule - Cross-Agent Tracing

Propagate trace context across agent boundaries:

from kalibr import TraceCapsule, get_or_create_capsule

# Agent 1: Create capsule and add hop
capsule = get_or_create_capsule()
capsule.append_hop({
    "provider": "openai",
    "operation": "chat_completion",
    "model": "gpt-4o",
    "duration_ms": 150,
    "cost_usd": 0.002,
    "status": "success"
})

# Pass to Agent 2 via HTTP header
headers = {"X-Kalibr-Capsule": capsule.to_json()}

# Agent 2: Receive and continue
capsule = TraceCapsule.from_json(headers["X-Kalibr-Capsule"])
capsule.append_hop({
    "provider": "anthropic",
    "operation": "chat_completion",
    "model": "claude-3-5-sonnet-20241022",
    "duration_ms": 200,
    "cost_usd": 0.003,
    "status": "success"
})
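The hand-off above works because the capsule is just JSON-serializable state that survives the trip through an HTTP header. A simplified stand-in class shows the round trip (this mirrors the shape of the API above but is not kalibr's actual `TraceCapsule`):

```python
import json
from dataclasses import dataclass, field

@dataclass
class MiniCapsule:
    """Simplified stand-in for a cross-agent trace capsule."""
    hops: list = field(default_factory=list)

    def append_hop(self, hop: dict) -> None:
        self.hops.append(hop)

    def to_json(self) -> str:
        return json.dumps({"hops": self.hops})

    @classmethod
    def from_json(cls, raw: str) -> "MiniCapsule":
        return cls(hops=json.loads(raw)["hops"])

# Agent 1 records a hop and serializes the capsule into a header
c1 = MiniCapsule()
c1.append_hop({"provider": "openai", "status": "success"})
headers = {"X-Kalibr-Capsule": c1.to_json()}

# Agent 2 reconstructs the capsule from the header and continues the trace
c2 = MiniCapsule.from_json(headers["X-Kalibr-Capsule"])
c2.append_hop({"provider": "anthropic", "status": "success"})
print(len(c2.hops))  # 2
```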

Framework Integrations

LangChain

pip install kalibr[langchain]

from kalibr_langchain import KalibrCallbackHandler
from langchain_openai import ChatOpenAI

handler = KalibrCallbackHandler()
llm = ChatOpenAI(model="gpt-4o", callbacks=[handler])
response = llm.invoke("What is the capital of France?")

See LangChain Integration Guide for full documentation.

CrewAI

pip install kalibr[crewai]

from kalibr_crewai import KalibrCrewAIInstrumentor
from crewai import Agent, Task, Crew

instrumentor = KalibrCrewAIInstrumentor()
instrumentor.instrument()

# Use CrewAI normally - all operations are traced

See CrewAI Integration Guide for full documentation.

OpenAI Agents SDK

pip install kalibr[openai-agents]

from kalibr_openai_agents import setup_kalibr_tracing
from agents import Agent, Runner

setup_kalibr_tracing()

agent = Agent(name="Assistant", instructions="You are helpful.")
result = Runner.run_sync(agent, "Hello!")

See OpenAI Agents Integration Guide for full documentation.

Configuration

Configure via environment variables:

| Variable | Description | Default |
|---|---|---|
| `KALIBR_API_KEY` | API key for authentication | Required |
| `KALIBR_TENANT_ID` | Tenant identifier | `default` |
| `KALIBR_COLLECTOR_URL` | Collector endpoint URL | `https://api.kalibr.systems/api/ingest` |
| `KALIBR_INTELLIGENCE_URL` | Intelligence API URL | `https://kalibr-intelligence.fly.dev` |
| `KALIBR_SERVICE_NAME` | Service name for spans | `kalibr-app` |
| `KALIBR_ENVIRONMENT` | Environment (prod/staging/dev) | `prod` |
| `KALIBR_WORKFLOW_ID` | Workflow identifier | `default` |
| `KALIBR_AUTO_INSTRUMENT` | Enable auto-instrumentation | `true` |
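For example, the variables can be exported in the shell before starting your app (the values below are placeholders):

```shell
# Placeholder values - substitute your real key and service name
export KALIBR_API_KEY="your-api-key"
export KALIBR_SERVICE_NAME="billing-service"
export KALIBR_ENVIRONMENT="staging"
python myapp.py
```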

CLI Commands

# Serve your app with tracing
kalibr serve myapp.py

# Run with managed runtime
kalibr run myapp.py --port 8000

# Deploy to cloud platforms
kalibr deploy myapp.py --runtime fly.io

# Fetch trace capsule by ID
kalibr capsule <trace-id>

# Show version
kalibr version

Supported Providers

| Provider | Models | Auto-Instrumentation |
|---|---|---|
| OpenAI | GPT-4, GPT-4o, GPT-3.5 | Yes |
| Anthropic | Claude 3.5 Sonnet, Claude 3 Opus/Sonnet/Haiku | Yes |
| Google | Gemini Pro, Gemini Flash | Yes |

Development

git clone https://github.com/kalibr-ai/kalibr-sdk-python.git
cd kalibr-sdk-python

pip install -e ".[dev]"

# Run tests
pytest

# Format code
black kalibr/
ruff check kalibr/

Contributing

We welcome contributions! See CONTRIBUTING.md.

License

Apache 2.0 - see LICENSE.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

kalibr-1.2.0.tar.gz (79.8 kB)


Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

kalibr-1.2.0-py3-none-any.whl (92.2 kB)


File details

Details for the file kalibr-1.2.0.tar.gz.

File metadata

  • Download URL: kalibr-1.2.0.tar.gz
  • Upload date:
  • Size: 79.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for kalibr-1.2.0.tar.gz

| Algorithm | Hash digest |
|---|---|
| SHA256 | `22e1799546988f6b39f4a3545548f3f083d14fd1c7ca26c182fc30b8fed0fd29` |
| MD5 | `6713ddf0c56da2af2ac9c2883c4b1127` |
| BLAKE2b-256 | `6d70cb849229df34ae914d629448777d7635c9ea72d840358338f07dfc460ae9` |

See the PyPI documentation for more details on using file hashes.

Provenance

The following attestation bundles were made for kalibr-1.2.0.tar.gz:

Publisher: publish.yml on kalibr-ai/kalibr-sdk-python

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file kalibr-1.2.0-py3-none-any.whl.

File metadata

  • Download URL: kalibr-1.2.0-py3-none-any.whl
  • Upload date:
  • Size: 92.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for kalibr-1.2.0-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | `b7935e649ca87ddcd06f3516e5bf830e3d5ba0dabb840d867ffcea3435b1ab93` |
| MD5 | `6d7c769fa400ff13f8e733efe4daf4a6` |
| BLAKE2b-256 | `627a8d2e568e37d56d5664157e489c2e512f50c482502775e00bbdef7f6fd54e` |

See the PyPI documentation for more details on using file hashes.

Provenance

The following attestation bundles were made for kalibr-1.2.0-py3-none-any.whl:

Publisher: publish.yml on kalibr-ai/kalibr-sdk-python

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
