


Kalibr SDK

Intelligent routing for AI agents. Kalibr picks the best model for each request, learns from outcomes, and shifts traffic to what works.

Installation

pip install kalibr
export KALIBR_API_KEY=kal_xxx  # Get from dashboard.kalibr.dev

Quick Start

from kalibr import Router

router = Router(
    goal="book_meeting",
    paths=["gpt-4o", "claude-3-sonnet", "gpt-4o-mini"],
    success_when=lambda output: "confirmed" in output.lower()
)

response = router.completion(
    messages=[{"role": "user", "content": "Book a meeting with John tomorrow"}]
)

print(response.choices[0].message.content)

That's it. Kalibr handles:

  • ✅ Picking the best model (Thompson Sampling)
  • ✅ Making the API call
  • ✅ Checking success
  • ✅ Learning for next time
  • ✅ Tracing everything
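
Model picking uses Thompson Sampling. As a rough, self-contained sketch of the idea (not the SDK's internal code): each path keeps a Beta(successes + 1, failures + 1) posterior over its success rate, and on every request the path with the highest sampled draw wins, so traffic naturally shifts toward paths that keep succeeding:

```python
import random

# Illustrative Thompson Sampling over paths -- NOT the SDK's internals.
class ThompsonRouter:
    def __init__(self, paths):
        self.stats = {p: {"success": 0, "failure": 0} for p in paths}

    def pick(self):
        # One Beta draw per path; route to the highest draw.
        def draw(s):
            return random.betavariate(s["success"] + 1, s["failure"] + 1)
        return max(self.stats, key=lambda p: draw(self.stats[p]))

    def report(self, path, success):
        self.stats[path]["success" if success else "failure"] += 1

demo = ThompsonRouter(["gpt-4o", "claude-3-sonnet"])
path = demo.pick()               # sample-and-argmax selection
demo.report(path, success=True)  # posterior update for next time
```

With no history every path starts at a flat Beta(1, 1) prior, so early traffic spreads roughly evenly; as outcomes accumulate, the posteriors tighten and the winner dominates.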

How It Works

  1. Define a goal - What is your agent trying to do?
  2. Register paths - Which models/tools can achieve it?
  3. Report outcomes - Did it work?
  4. Kalibr routes - Traffic shifts to winners

Paths

A path is a model + optional tools + optional params:

# Simple: just models
paths = ["gpt-4o", "claude-3-sonnet"]

# With tools
paths = [
    {"model": "gpt-4o", "tools": ["web_search"]},
    {"model": "claude-3-sonnet", "tools": ["web_search", "browser"]},
]

# With params
paths = [
    {"model": "gpt-4o", "params": {"temperature": 0.7}},
    {"model": "gpt-4o", "params": {"temperature": 0.2}},
]
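
The three forms can also be mixed in one list; a sketch, assuming (as the examples above suggest) that the dict keys compose and plain strings remain model-only shorthand:

```python
# Hypothetical combined path using the same keys shown above.
paths = [
    {"model": "gpt-4o", "tools": ["web_search"], "params": {"temperature": 0.2}},
    "claude-3-sonnet",  # string shorthand: model only, no tools or params
]
```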

Success Criteria

Auto-detect from output

router = Router(
    goal="summarize",
    paths=["gpt-4o", "claude-3-sonnet"],
    success_when=lambda output: len(output) > 100
)
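
`success_when` just needs a callable from the model output to a bool, so it can be any plain Python function, not only a lambda. A slightly richer predicate (`meeting_confirmed` is a hypothetical name, not part of the SDK):

```python
import re

# Word-boundary match so "unconfirmed" does not count as success.
def meeting_confirmed(output: str) -> bool:
    return bool(re.search(r"\bconfirmed\b", output, re.IGNORECASE))
```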

Manual reporting

router = Router(goal="book_meeting", paths=["gpt-4o", "claude-3-sonnet"])

response = router.completion(messages=[...])

# Your verification logic
meeting_created = check_calendar_api()

router.report(success=meeting_created)

Framework Integration

LangChain

from kalibr import Router
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate

router = Router(goal="summarize", paths=["gpt-4o", "claude-3-sonnet"])
llm = router.as_langchain()

prompt = ChatPromptTemplate.from_template("Summarize: {text}")
parser = StrOutputParser()

chain = prompt | llm | parser
result = chain.invoke({"text": "..."})

CrewAI

from crewai import Agent
from kalibr import Router

router = Router(goal="research", paths=["gpt-4o", "claude-3-sonnet"])

agent = Agent(
    role="Researcher",
    llm=router.as_langchain(),
    ...
)

Observability (Included)

Every call is automatically traced:

  • Token counts and costs
  • Latency (p50, p95, p99)
  • Tool usage
  • Errors with stack traces

View in the dashboard or use callback handlers directly:

from kalibr_langchain import KalibrCallbackHandler

handler = KalibrCallbackHandler()
chain.invoke({"input": "..."}, config={"callbacks": [handler]})

Pricing

Tier        Routing Decisions   Price
Free        1,000/month         $0
Pro         50,000/month        $49/month
Enterprise  Unlimited           Custom

API Reference

Router

Router(
    goal: str,                                   # Required: name of the goal
    paths: List[str | dict],                     # Models/tools to route between
    success_when: Callable[[str], bool] = None,  # Optional: auto-evaluate success
    exploration_rate: float = 0.1,               # Optional: 0.0-1.0
)
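
How `exploration_rate` is applied is not documented here; this epsilon-greedy-style sketch is an assumption, shown only for intuition about the parameter: with probability `exploration_rate`, try a random path; otherwise stick with the current best.

```python
import random

# ASSUMED semantics of exploration_rate, for intuition only.
def choose(paths, best, exploration_rate=0.1):
    if random.random() < exploration_rate:
        return random.choice(paths)  # explore: any path may be tried
    return best                      # exploit: current best-performing path
```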

Methods

router.completion(messages, **kwargs)  # Make routed request
router.report(success, reason=None)    # Report outcome manually
router.add_path(model, tools=None)     # Add path dynamically
router.as_langchain()                  # Get LangChain-compatible LLM


License

MIT
