Kalibr

Adaptive routing for AI agents. Kalibr learns which models work best for your tasks and routes automatically.


Requirements

  • Python 3.10 or higher
  • pip 21.0 or higher

Installation

pip install kalibr

For accurate token counting, install with:

pip install kalibr[tokens]

Setup

Get your credentials from dashboard.kalibr.systems/settings, then:

export KALIBR_API_KEY=your-api-key
export KALIBR_TENANT_ID=your-tenant-id
export OPENAI_API_KEY=sk-...  # or ANTHROPIC_API_KEY for Claude models

Quick Start

from kalibr import Router

router = Router(
    goal="extract_company",
    paths=["gpt-4o", "claude-sonnet-4-20250514"]
)

response = router.completion(
    messages=[{"role": "user", "content": "Extract the company: Hi, I'm Sarah from Stripe."}]
)

router.report(success=True)

Kalibr picks the best model, makes the call, and learns from the outcome.

How It Works

  1. You define paths - models (and optionally tools/params) that can handle your task
  2. Kalibr picks - Thompson Sampling balances exploring new paths against exploiting known winners
  3. You report outcomes - tell Kalibr whether the call succeeded
  4. Kalibr learns - routes more traffic to what works
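The Thompson Sampling idea behind step 2 can be sketched in a few lines. This is an illustration of the technique, not Kalibr's internal implementation; the success/failure counts are made up:

```python
import random

random.seed(42)  # reproducible demo

# Hypothetical per-path outcome counts accumulated from reports.
stats = {
    "gpt-4o": {"successes": 8, "failures": 2},
    "claude-sonnet-4-20250514": {"successes": 5, "failures": 5},
}

def pick_path(stats):
    """Sample each path's Beta(successes+1, failures+1) posterior
    and pick the path with the highest draw."""
    draws = {
        path: random.betavariate(s["successes"] + 1, s["failures"] + 1)
        for path, s in stats.items()
    }
    return max(draws, key=draws.get)

chosen = pick_path(stats)
```

A path with a better track record is drawn more often, but the weaker path still wins occasionally, which is how exploration is preserved while favoring what works.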

Paths

A path is a model + optional tools + optional params:

# Just models
paths = ["gpt-4o", "claude-sonnet-4-20250514", "gpt-4o-mini"]

# With tools
paths = [
    {"model": "gpt-4o", "tools": ["web_search"]},
    {"model": "claude-sonnet-4-20250514", "tools": ["web_search", "browser"]},
]

# With params
paths = [
    {"model": "gpt-4o", "params": {"temperature": 0.7}},
    {"model": "gpt-4o", "params": {"temperature": 0.2}},
]

Advanced Path Configuration

Routing Between Parameters

Kalibr can route between different parameter configurations of the same model:

from kalibr import Router

router = Router(
    goal="creative_writing",
    paths=[
        {"model": "gpt-4o", "params": {"temperature": 0.3}},
        {"model": "gpt-4o", "params": {"temperature": 0.9}},
        {"model": "claude-sonnet-4-20250514", "params": {"temperature": 0.7}}
    ]
)

response = router.completion(messages=[...])
router.report(success=True)

Each unique (model, params) combination is tracked separately. Kalibr learns which configuration works best for your specific goal.
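One way to picture "each unique (model, params) combination is tracked separately": a path spec can be reduced to a canonical key. This is an illustration of the idea, not Kalibr's actual keying scheme:

```python
import json

def path_key(path):
    """Canonical key for a path spec: a bare model string or a dict
    with optional tools/params. Sorting keys makes the key stable
    regardless of dict insertion order."""
    if isinstance(path, str):
        path = {"model": path}
    return json.dumps(path, sort_keys=True)

# Same model, different temperature -> two distinct paths:
a = path_key({"model": "gpt-4o", "params": {"temperature": 0.3}})
b = path_key({"model": "gpt-4o", "params": {"temperature": 0.9}})
assert a != b

# A bare string and its dict form normalize to the same key:
assert path_key("gpt-4o") == path_key({"model": "gpt-4o"})
```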

Routing Between Tools

router = Router(
    goal="research_task",
    paths=[
        {"model": "gpt-4o", "tools": ["web_search"]},
        {"model": "gpt-4o", "tools": ["code_interpreter"]},
        {"model": "claude-sonnet-4-20250514"}
    ]
)

When to Use get_policy() Instead of Router

For most use cases, use Router. It handles provider dispatching and response conversion automatically.

Use get_policy() for advanced scenarios:

  • Integrating with frameworks like LangChain that wrap LLM calls
  • Custom retry logic or provider-specific features
  • Building tools that need fine-grained control

from kalibr import get_policy, report_outcome
from openai import OpenAI

policy = get_policy(goal="summarize")
model = policy["recommended_model"]

# You call the provider yourself
if model.startswith("gpt"):
    client = OpenAI()
    response = client.chat.completions.create(model=model, messages=[...])

# Report the outcome so the router can learn from it
report_outcome(trace_id=trace_id, goal="summarize", success=True)
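The prefix check above generalizes to a small dispatch helper if you route across several providers. The prefix-to-provider mapping here is an assumption for illustration, not part of Kalibr's API:

```python
def provider_for(model: str) -> str:
    """Map a model id to a provider name by prefix (hypothetical mapping)."""
    if model.startswith("gpt"):
        return "openai"
    if model.startswith("claude"):
        return "anthropic"
    raise ValueError(f"No provider configured for model: {model}")
```

You would then instantiate the matching client (OpenAI, Anthropic, etc.) based on the returned provider name before making the call yourself.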

Outcome Reporting

Automatic (with success_when)

router = Router(
    goal="summarize",
    paths=["gpt-4o", "claude-sonnet-4-20250514"],
    success_when=lambda output: len(output) > 100
)

response = router.completion(messages=[...])
# Outcome reported automatically based on success_when
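A success_when predicate is just a function from the model's output text to a bool, so it can encode richer checks than length. For example, a structured-extraction goal might require parseable JSON with a specific field; the required "company" field is an assumption about your task, not part of Kalibr:

```python
import json

def valid_company_json(output: str) -> bool:
    """Succeed only if the output parses as a JSON object
    with a non-empty 'company' field."""
    try:
        data = json.loads(output)
    except (json.JSONDecodeError, TypeError):
        return False
    return isinstance(data, dict) and bool(data.get("company"))
```

Passing `success_when=valid_company_json` to Router would then report success only for well-formed extractions.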

Manual

router = Router(goal="book_meeting", paths=["gpt-4o", "claude-sonnet-4-20250514"])
response = router.completion(messages=[...])

meeting_created = check_calendar_api()
router.report(success=meeting_created)

LangChain Integration

pip install kalibr[langchain]

from kalibr import Router

router = Router(goal="summarize", paths=["gpt-4o", "claude-sonnet-4-20250514"])
llm = router.as_langchain()

chain = prompt | llm | parser

Auto-Instrumentation

Kalibr auto-instruments OpenAI, Anthropic, and Google SDKs on import:

import kalibr  # Must be first import
from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(model="gpt-4o", messages=[...])
# Traced automatically

Disable with KALIBR_AUTO_INSTRUMENT=false.

Low-Level API

For advanced use cases, you can use the intelligence API directly:

from kalibr import register_path, decide, report_outcome

# Register paths
register_path(goal="book_meeting", model_id="gpt-4o")
register_path(goal="book_meeting", model_id="claude-sonnet-4-20250514")

# Get routing decision
decision = decide(goal="book_meeting")
model = decision["model_id"]

# Make your own LLM call, then report
report_outcome(trace_id="...", goal="book_meeting", success=True)

Other Integrations

pip install kalibr[tokens]        # Accurate token counting (tiktoken)
pip install kalibr[crewai]        # CrewAI
pip install kalibr[openai-agents] # OpenAI Agents SDK
pip install kalibr[langchain-all] # LangChain with all providers

Configuration

Variable                 Description                  Default
KALIBR_API_KEY           API key from dashboard       Required
KALIBR_TENANT_ID         Tenant ID from dashboard     Required
KALIBR_AUTO_INSTRUMENT   Auto-instrument LLM SDKs     true
KALIBR_INTELLIGENCE_URL  Intelligence service URL     https://kalibr-intelligence.fly.dev
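The variables above can be read with their documented defaults along these lines. This is a minimal sketch of env-var handling, not Kalibr's own config loader:

```python
import os

def load_config(env=os.environ):
    """Read Kalibr settings from environment variables,
    applying the documented defaults."""
    config = {
        "api_key": env.get("KALIBR_API_KEY"),       # required
        "tenant_id": env.get("KALIBR_TENANT_ID"),   # required
        "auto_instrument": env.get("KALIBR_AUTO_INSTRUMENT", "true").lower() != "false",
        "intelligence_url": env.get(
            "KALIBR_INTELLIGENCE_URL", "https://kalibr-intelligence.fly.dev"
        ),
    }
    missing = [name for name, value in [
        ("KALIBR_API_KEY", config["api_key"]),
        ("KALIBR_TENANT_ID", config["tenant_id"]),
    ] if not value]
    if missing:
        raise RuntimeError(f"Missing required env vars: {', '.join(missing)}")
    return config
```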

Development

git clone https://github.com/kalibr-ai/kalibr-sdk-python.git
cd kalibr-sdk-python
pip install -e ".[dev]"
pytest

Contributing

See CONTRIBUTING.md.

License

Apache-2.0

Version

1.4.1
