
Keiro client — call the EB1 multi-model ensemble API.


Keiro

EB1 multi-model ensemble inference. Run multiple frontier models in parallel and synthesize the best response.

Quick start

pip install keiro
keiro setup

from keiro import models

print(models("eb1-preview", "What is machine learning?"))

Or from the command line:

keiro "What is machine learning?"

How it works

EB1 sends your prompt to multiple frontier models (Claude, GPT, Gemini) in parallel, then a judge model synthesizes the strongest elements into a single response. The synthesized result is typically more accurate and more complete than any single model's answer.
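The fan-out/judge pattern described above can be sketched as follows. This is an illustration of the general idea, not Keiro's actual internals; the model names and the `ask` helper are hypothetical placeholders for real model calls.

```python
# Sketch of a parallel fan-out ensemble with a judge step.
from concurrent.futures import ThreadPoolExecutor

def ask(model: str, prompt: str) -> str:
    # Placeholder for a real model call.
    return f"[{model}] answer to: {prompt}"

def ensemble(prompt: str, candidates: list[str], judge: str) -> str:
    # Query every candidate model concurrently.
    with ThreadPoolExecutor(max_workers=len(candidates)) as pool:
        answers = list(pool.map(lambda m: ask(m, prompt), candidates))
    # Ask the judge to merge the strongest elements into one response.
    synthesis_prompt = "Synthesize the best answer from:\n" + "\n---\n".join(answers)
    return ask(judge, synthesis_prompt)

print(ensemble("What is ML?", ["model-a", "model-b"], "judge-model"))
```

Because the candidate calls are I/O-bound network requests, a thread pool is enough to overlap them; total latency is roughly one candidate round-trip plus the judge call.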

Models

Model                    Description
eb1-preview (default)    Adaptive GNN-routed ensemble
eb1-delta-preview        Adaptive ensemble with orchestration
eb1                      Standard 5-model ensemble
eb1-pro                  Extended 6-model ensemble
eb1-frontier             Highest quality, max reasoning
eb1-codex                Optimized for code and SWE tasks
eb1-fast                 Low latency, lighter models
eb1-fast-preview         Adaptive routing, low latency
eb1-frontier-preview     Adaptive routing, max quality
claude-opus-4-6          Direct passthrough (no ensemble)
gpt-5.2                  Direct passthrough
from keiro import models

# Default adaptive ensemble
answer = models("eb1-preview", "Solve this step by step: what is 23 * 47?")

# Max quality
answer = models("eb1-frontier", "Prove that sqrt(2) is irrational.")

# Low latency
answer = models("eb1-fast", "Summarize this in one sentence.")

# Direct passthrough to a single model
answer = models("claude-opus-4-6", "Write a haiku")

Prompt-first API

from keiro import models

# Structured response with usage metadata
reply = models.response("eb1-preview", "Explain quantum computing.")
print(reply.text)
print(reply.usage)

# Reusable model binding with fixed parameters
creative = models.instance("eb1-preview", temperature=0.8)
print(creative("Write a limerick about debugging."))

# Streaming
for chunk in models.stream("eb1-preview", "Draft a launch email."):
    print(chunk, end="")

Full client

from keiro import Client

client = Client()

# Chat completions API
response = client.chat(
    messages=[{"role": "user", "content": "Explain quantum computing."}],
    model="eb1-preview",
)
print(response["choices"][0]["message"]["content"])

# Rate limit visibility
print(client.rate_limits)
# RateLimitInfo(limit_requests=1000, remaining_requests=999, ...)

client.close()

CLI

keiro "What is ML?"                 # one-shot response
keiro                               # interactive REPL
keiro -m eb1-fast "Quick answer"    # specific model
echo context | keiro "Summarize"    # pipe context as input
keiro setup                         # configure credentials
keiro models                        # list available models

Configuration

Interactive setup (recommended):

keiro setup

This validates your API key against the gateway and saves credentials to ~/.keiro/credentials.

Environment variables:

export KEIRO_API_KEY="your-api-key"

Explicit arguments:

from keiro import Client

client = Client(api_key="your-key")

Precedence: explicit arguments > environment variables > credentials file.

Requirements

  • Python 3.11+
  • No GPU required (inference runs on hosted infrastructure)
