Kalibr

Adaptive routing for AI agents. Kalibr learns which models work best for your tasks and routes automatically.

Installation

pip install kalibr

Setup

Get your credentials from dashboard.kalibr.systems/settings, then:

export KALIBR_API_KEY=your-api-key
export KALIBR_TENANT_ID=your-tenant-id
export OPENAI_API_KEY=sk-...  # or ANTHROPIC_API_KEY for Claude models

Quick Start

from kalibr import Router

router = Router(
    goal="extract_company",
    paths=["gpt-4o", "claude-sonnet-4-20250514"]
)

response = router.completion(
    messages=[{"role": "user", "content": "Extract the company: Hi, I'm Sarah from Stripe."}]
)

router.report(success=True)

Kalibr picks the best model, makes the call, and learns from the outcome.

How It Works

  1. You define paths - models (and optionally tools/params) that can handle your task
  2. Kalibr picks - uses Thompson Sampling to balance exploration vs exploitation
  3. You report outcomes - tell Kalibr if it worked
  4. Kalibr learns - routes more traffic to what works
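The selection step above can be sketched as a minimal Beta-Bernoulli Thompson Sampler. This illustrates the general technique, not Kalibr's internal implementation; `stats`, `pick_path`, and `report` are hypothetical names:

```python
import random

# One Beta(successes + 1, failures + 1) posterior per path over its success rate.
stats = {"gpt-4o": [1, 1], "claude-sonnet-4-20250514": [1, 1]}

def pick_path():
    # Thompson Sampling: draw a success rate from each path's posterior
    # and pick the path with the highest draw. Uncertain paths still get
    # occasional high draws (exploration); proven paths win most draws
    # (exploitation).
    draws = {p: random.betavariate(a, b) for p, (a, b) in stats.items()}
    return max(draws, key=draws.get)

def report(path, success):
    # Fold the observed outcome back into that path's posterior.
    stats[path][0 if success else 1] += 1

chosen = pick_path()
report(chosen, success=True)
```

As outcomes accumulate, the posterior for a reliable path concentrates near its true success rate, so it wins more draws and receives more traffic.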

Paths

A path is a model + optional tools + optional params:

# Just models
paths = ["gpt-4o", "claude-sonnet-4-20250514", "gpt-4o-mini"]

# With tools
paths = [
    {"model": "gpt-4o", "tools": ["web_search"]},
    {"model": "claude-sonnet-4-20250514", "tools": ["web_search", "browser"]},
]

# With params
paths = [
    {"model": "gpt-4o", "params": {"temperature": 0.7}},
    {"model": "gpt-4o", "params": {"temperature": 0.2}},
]
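A bare model string behaves like a dict with only a model key. One plausible normalization (the `normalize_path` helper is hypothetical, not part of the SDK's public API) looks like:

```python
def normalize_path(path):
    # Treat a bare model string as shorthand for a dict-style path
    # (illustrative; the SDK's internal normalization may differ).
    if isinstance(path, str):
        return {"model": path, "tools": [], "params": {}}
    return {
        "model": path["model"],
        "tools": path.get("tools", []),
        "params": path.get("params", {}),
    }
```

Under this reading, `"gpt-4o"` and `{"model": "gpt-4o"}` describe the same path.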

Advanced Path Configuration

Routing Between Parameters

Kalibr can route between different parameter configurations of the same model:

from kalibr import Router

router = Router(
    goal="creative_writing",
    paths=[
        {"model": "gpt-4o", "params": {"temperature": 0.3}},
        {"model": "gpt-4o", "params": {"temperature": 0.9}},
        {"model": "claude-sonnet-4-20250514", "params": {"temperature": 0.7}}
    ]
)

response = router.completion(messages=[...])
router.report(success=True)

Each unique (model, params) combination is tracked separately. Kalibr learns which configuration works best for your specific goal.
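One way to give each (model, params) combination its own identity is to freeze the params dict into a hashable key. This is a sketch of the idea only; `path_key` is a hypothetical helper, not the SDK's actual bookkeeping:

```python
def path_key(model, params=None):
    # Sort and freeze the params so equal configurations compare equal
    # and distinct configurations (e.g. different temperatures) do not.
    frozen = tuple(sorted((params or {}).items()))
    return (model, frozen)
```

With keys like these, the two `gpt-4o` temperature variants above accumulate separate success statistics.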

Routing Between Tools

router = Router(
    goal="research_task",
    paths=[
        {"model": "gpt-4o", "tools": ["web_search"]},
        {"model": "gpt-4o", "tools": ["code_interpreter"]},
        {"model": "claude-sonnet-4-20250514"}
    ]
)

When to Use get_policy() Instead of Router

For most use cases, use Router. It handles provider dispatching and response conversion automatically.

Use get_policy() for advanced scenarios:

  • Integrating with frameworks like LangChain that wrap LLM calls
  • Custom retry logic or provider-specific features
  • Building tools that need fine-grained control

from openai import OpenAI

from kalibr import get_policy, report_outcome

policy = get_policy(goal="summarize")
model = policy["recommended_model"]

# You call the provider yourself
if model.startswith("gpt"):
    client = OpenAI()
    response = client.chat.completions.create(model=model, messages=[...])

report_outcome(trace_id="...", goal="summarize", success=True)

Outcome Reporting

Automatic (with success_when)

router = Router(
    goal="summarize",
    paths=["gpt-4o", "claude-sonnet-4-20250514"],
    success_when=lambda output: len(output) > 100
)

response = router.completion(messages=[...])
# Outcome reported automatically based on success_when
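The `success_when` predicate can encode any check on the raw output, not just length. For example, a hypothetical predicate (the `valid_summary` name and JSON schema are illustrative assumptions) that requires valid JSON with a non-empty summary field:

```python
import json

def valid_summary(output: str) -> bool:
    # Success only when the output parses as a JSON object
    # containing a non-empty "summary" value.
    try:
        data = json.loads(output)
    except json.JSONDecodeError:
        return False
    if not isinstance(data, dict):
        return False
    return bool(data.get("summary"))
```

Pass it as `success_when=valid_summary` so each completion is scored against the structure you actually need downstream.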

Manual

router = Router(goal="book_meeting", paths=["gpt-4o", "claude-sonnet-4-20250514"])
response = router.completion(messages=[...])

meeting_created = check_calendar_api()
router.report(success=meeting_created)

LangChain Integration

pip install kalibr[langchain]

from kalibr import Router

router = Router(goal="summarize", paths=["gpt-4o", "claude-sonnet-4-20250514"])
llm = router.as_langchain()

chain = prompt | llm | parser

Auto-Instrumentation

Kalibr auto-instruments OpenAI, Anthropic, and Google SDKs on import:

import kalibr  # Must be first import
from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(model="gpt-4o", messages=[...])
# Traced automatically

Disable with KALIBR_AUTO_INSTRUMENT=false.

Low-Level API

For advanced use cases, you can use the intelligence API directly:

from kalibr import register_path, decide, report_outcome

# Register paths
register_path(goal="book_meeting", model_id="gpt-4o")
register_path(goal="book_meeting", model_id="claude-sonnet-4-20250514")

# Get routing decision
decision = decide(goal="book_meeting")
model = decision["model_id"]

# Make your own LLM call, then report
report_outcome(trace_id="...", goal="book_meeting", success=True)

Other Integrations

pip install kalibr[crewai]        # CrewAI
pip install kalibr[openai-agents] # OpenAI Agents SDK
pip install kalibr[langchain-all] # LangChain with all providers

Configuration

Variable                 Description                 Default
KALIBR_API_KEY           API key from dashboard      Required
KALIBR_TENANT_ID         Tenant ID from dashboard    Required
KALIBR_AUTO_INSTRUMENT   Auto-instrument LLM SDKs    true
KALIBR_INTELLIGENCE_URL  Intelligence service URL    https://kalibr-intelligence.fly.dev
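Resolving these variables with their documented defaults might look like the following sketch (the `load_config` helper is hypothetical; the SDK's actual loader may differ):

```python
import os

def load_config():
    # Required settings raise KeyError if missing; optional ones fall back
    # to the defaults from the table above. For illustration, any value
    # other than the literal "false" leaves auto-instrumentation enabled.
    return {
        "api_key": os.environ["KALIBR_API_KEY"],
        "tenant_id": os.environ["KALIBR_TENANT_ID"],
        "auto_instrument": os.environ.get("KALIBR_AUTO_INSTRUMENT", "true") != "false",
        "intelligence_url": os.environ.get(
            "KALIBR_INTELLIGENCE_URL", "https://kalibr-intelligence.fly.dev"
        ),
    }
```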

Development

git clone https://github.com/kalibr-ai/kalibr-sdk-python.git
cd kalibr-sdk-python
pip install -e ".[dev]"
pytest

Contributing

See CONTRIBUTING.md.

License

Apache-2.0
