
Kalibr

Adaptive routing for AI agents. Kalibr learns which models work best for your tasks and routes automatically.


Requirements

  • Python 3.10 or higher
  • pip 21.0 or higher

Installation

pip install kalibr

For accurate token counting, install with:

pip install kalibr[tokens]

Setup

Get your credentials from dashboard.kalibr.systems/settings, then:

export KALIBR_API_KEY=your-api-key
export KALIBR_TENANT_ID=your-tenant-id
export OPENAI_API_KEY=sk-...  # or ANTHROPIC_API_KEY for Claude models

Quick Start

from kalibr import Router

router = Router(
    goal="extract_company",
    paths=["gpt-4o", "claude-sonnet-4-20250514"]
)

response = router.completion(
    messages=[{"role": "user", "content": "Extract the company: Hi, I'm Sarah from Stripe."}]
)

router.report(success=True)

Kalibr picks the best model, makes the call, and learns from the outcome.

How It Works

  1. You define paths - models (and optionally tools or params) that can handle your task
  2. Kalibr picks - uses Thompson Sampling to balance exploration against exploitation
  3. You report outcomes - tell Kalibr whether the call succeeded
  4. Kalibr learns - routes more traffic to what works
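The Thompson Sampling step can be sketched as a Beta-Bernoulli bandit. This is a minimal illustration of the technique, not Kalibr's actual implementation; the counts are made up:

```python
import random

# Per-path success/failure counts (Beta-Bernoulli arms). Counts are illustrative.
arms = {
    "gpt-4o": {"wins": 8, "losses": 2},
    "claude-sonnet-4-20250514": {"wins": 5, "losses": 5},
}

def pick_path(arms):
    """Thompson Sampling: draw one sample from each arm's Beta posterior
    and route to the arm with the highest draw."""
    samples = {
        path: random.betavariate(a["wins"] + 1, a["losses"] + 1)
        for path, a in arms.items()
    }
    return max(samples, key=samples.get)

def report(arms, path, success):
    """Update the chosen arm's posterior with the observed outcome."""
    arms[path]["wins" if success else "losses"] += 1

choice = pick_path(arms)
report(arms, choice, success=True)
```

Because each draw is random, under-explored paths still get traffic occasionally, while paths with strong track records win most draws.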

Paths

A path is a model + optional tools + optional params:

# Just models
paths = ["gpt-4o", "claude-sonnet-4-20250514", "gpt-4o-mini"]

# With tools
paths = [
    {"model": "gpt-4o", "tools": ["web_search"]},
    {"model": "claude-sonnet-4-20250514", "tools": ["web_search", "browser"]},
]

# With params
paths = [
    {"model": "gpt-4o", "params": {"temperature": 0.7}},
    {"model": "gpt-4o", "params": {"temperature": 0.2}},
]

Advanced Path Configuration

Routing Between Parameters

Kalibr can route between different parameter configurations of the same model:

from kalibr import Router

router = Router(
    goal="creative_writing",
    paths=[
        {"model": "gpt-4o", "params": {"temperature": 0.3}},
        {"model": "gpt-4o", "params": {"temperature": 0.9}},
        {"model": "claude-sonnet-4-20250514", "params": {"temperature": 0.7}}
    ]
)

response = router.completion(messages=[...])
router.report(success=True)

Each unique (model, params) combination is tracked separately. Kalibr learns which configuration works best for your specific goal.
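One way to see why the combinations stay distinct: each arm needs an identity that includes its parameters. A hypothetical key scheme (not Kalibr's actual internals) might look like:

```python
import json

def path_key(model, params=None, tools=None):
    """Build a canonical string key so each (model, params, tools) combination
    is tracked as a separate arm. sort_keys makes dict ordering irrelevant."""
    return json.dumps(
        {"model": model, "params": params or {}, "tools": sorted(tools or [])},
        sort_keys=True,
    )

# Same model, different temperature -> two distinct arms
k1 = path_key("gpt-4o", {"temperature": 0.3})
k2 = path_key("gpt-4o", {"temperature": 0.9})
```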

Routing Between Tools

router = Router(
    goal="research_task",
    paths=[
        {"model": "gpt-4o", "tools": ["web_search"]},
        {"model": "gpt-4o", "tools": ["code_interpreter"]},
        {"model": "claude-sonnet-4-20250514"}
    ]
)

When to Use get_policy() Instead of Router

For most use cases, use Router. It handles provider dispatching and response conversion automatically.

Use get_policy() for advanced scenarios:

  • Integrating with frameworks like LangChain that wrap LLM calls
  • Custom retry logic or provider-specific features
  • Building tools that need fine-grained control

from kalibr import get_policy, report_outcome
from openai import OpenAI

policy = get_policy(goal="summarize")
model = policy["recommended_model"]

# You call the provider yourself
if model.startswith("gpt"):
    client = OpenAI()
    response = client.chat.completions.create(model=model, messages=[...])

# Report the outcome against the trace ID for this call
report_outcome(trace_id="...", goal="summarize", success=True)

Outcome Reporting

Automatic (with success_when)

router = Router(
    goal="summarize",
    paths=["gpt-4o", "claude-sonnet-4-20250514"],
    success_when=lambda output: len(output) > 100
)

response = router.completion(messages=[...])
# Outcome reported automatically based on success_when
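The success_when predicate can be any callable on the model output. For example, a hypothetical check (written for illustration, using only the standard library) that the output parses as JSON with a required field:

```python
import json

def valid_extraction(output: str) -> bool:
    """Hypothetical success predicate: succeed only if the output is JSON
    containing a non-empty 'company' field."""
    try:
        data = json.loads(output)
    except json.JSONDecodeError:
        return False
    return bool(data.get("company"))

# Would be passed as: Router(goal=..., paths=[...], success_when=valid_extraction)
```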

Manual

router = Router(goal="book_meeting", paths=["gpt-4o", "claude-sonnet-4-20250514"])
response = router.completion(messages=[...])

meeting_created = check_calendar_api()
router.report(success=meeting_created)

LangChain Integration

pip install kalibr[langchain]

from kalibr import Router

router = Router(goal="summarize", paths=["gpt-4o", "claude-sonnet-4-20250514"])
llm = router.as_langchain()

chain = prompt | llm | parser

Auto-Instrumentation

Kalibr auto-instruments OpenAI, Anthropic, and Google SDKs on import:

import kalibr  # Must be first import
from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(model="gpt-4o", messages=[...])
# Traced automatically

Disable with KALIBR_AUTO_INSTRUMENT=false.

Low-Level API

For advanced use cases, you can use the intelligence API directly:

from kalibr import register_path, decide, report_outcome

# Register paths
register_path(goal="book_meeting", model_id="gpt-4o")
register_path(goal="book_meeting", model_id="claude-sonnet-4-20250514")

# Get routing decision
decision = decide(goal="book_meeting")
model = decision["model_id"]

# Make your own LLM call, then report
report_outcome(trace_id="...", goal="book_meeting", success=True)

Other Integrations

pip install kalibr[tokens]        # Accurate token counting (tiktoken)
pip install kalibr[crewai]        # CrewAI
pip install kalibr[openai-agents] # OpenAI Agents SDK
pip install kalibr[langchain-all] # LangChain with all providers

Configuration

Variable | Description | Default
--- | --- | ---
KALIBR_API_KEY | API key from the dashboard | Required
KALIBR_TENANT_ID | Tenant ID from the dashboard | Required
KALIBR_AUTO_INSTRUMENT | Auto-instrument LLM SDKs | true
KALIBR_INTELLIGENCE_URL | Intelligence service URL | https://kalibr-intelligence.fly.dev

Development

git clone https://github.com/kalibr-ai/kalibr-sdk-python.git
cd kalibr-sdk-python
pip install -e ".[dev]"
pytest

Contributing

See CONTRIBUTING.md.

License

Apache-2.0


Version

1.4.2
