Switchboard

Config-driven AI model switching made simple

Switchboard is a Python library that provides a unified API for switching between different AI models and providers with built-in fallback support. Configure once, switch seamlessly.

Features

  • Unified API - Single interface for OpenAI, Anthropic, and more
  • Task-based routing - Automatically select models based on task type
  • Fallback chains - Automatic failover when primary models are unavailable
  • Dynamic model discovery - Automatically fetches available models from provider APIs
  • Configuration-driven - YAML-based configuration for easy management
  • Environment-aware - Support for development, staging, and production configs
  • Type-safe - Full type hints and Pydantic validation

Quick Start

Installation

```bash
# Base installation
pip install switchboard-ai

# With OpenAI support
pip install switchboard-ai[openai]

# With Anthropic support
pip install switchboard-ai[anthropic]

# With all providers
pip install switchboard-ai[all]
```

Basic Usage

1. Create a configuration file switchboard.yaml:

```yaml
models:
  gpt-4:
    provider: openai
    model_name: gpt-4
    api_key_env: OPENAI_API_KEY
    max_tokens: 4096
    temperature: 0.7

  claude-3:
    provider: anthropic
    model_name: claude-3-sonnet-20240229
    api_key_env: ANTHROPIC_API_KEY
    max_tokens: 4096
    temperature: 0.7

tasks:
  coding:
    primary_model: gpt-4
    fallback_models: [claude-3]
    description: Code generation and programming

default_model: gpt-4
```
2. Set your API keys:

```bash
export OPENAI_API_KEY="your-openai-key"
export ANTHROPIC_API_KEY="your-anthropic-key"
```
3. Use in your code:

```python
from switchboard import Client

# Initialize client
client = Client()

# Generate completion
response = client.complete("Write a Python function to calculate fibonacci numbers")
print(response.content)

# Use task-based routing
response = client.complete("Write a Python function", task="coding")
print(f"Used model: {response.model}")

# Override model for specific request
response = client.complete("Hello", model="claude-3")
```

Configuration

Models

Define available models with their provider configurations:

```yaml
models:
  model-name:
    provider: openai|anthropic|local
    model_name: actual-model-id
    api_key_env: ENV_VAR_NAME
    max_tokens: 4096
    temperature: 0.7
    timeout: 30
    extra_params:
      custom_param: value
```
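
The Features list says these entries are validated with Pydantic. As a rough illustration of the shape such validation takes — stdlib dataclasses stand in for Pydantic here, and the class and method names are ours, not the library's API:

```python
import os
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical stand-in for the library's validated config model.
@dataclass
class ModelConfig:
    provider: str
    model_name: str
    api_key_env: str
    max_tokens: int = 4096
    temperature: float = 0.7
    timeout: int = 30
    extra_params: dict = field(default_factory=dict)

    def __post_init__(self):
        # Reject values the schema above could not have produced.
        if self.provider not in {"openai", "anthropic", "local"}:
            raise ValueError(f"unknown provider: {self.provider}")
        if not 0.0 <= self.temperature <= 2.0:
            raise ValueError("temperature must be in [0, 2]")

    def api_key(self) -> Optional[str]:
        # Keys are looked up in the environment, never stored in config.
        return os.environ.get(self.api_key_env)

raw = {"provider": "openai", "model_name": "gpt-4", "api_key_env": "OPENAI_API_KEY"}
cfg = ModelConfig(**raw)
print(cfg.max_tokens)  # optional fields fall back to defaults
```

Indirection through api_key_env is the design point worth noting: configuration files stay free of secrets and can be committed safely.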

Tasks

Configure task-based routing with fallback chains:

```yaml
tasks:
  task-name:
    primary_model: model-name
    fallback_models: [backup-model-1, backup-model-2]
    description: "Task description"
```
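
Conceptually, a task resolves to an ordered candidate list — primary first, then fallbacks — and the first model to answer wins. A minimal sketch of that loop, with stub callables standing in for real providers (every name here is hypothetical, not the library's API):

```python
# Hypothetical routing table mirroring the YAML above.
TASKS = {"coding": ["gpt-4", "claude-3"]}

def complete_with_fallback(task, prompt, providers):
    """Try each model configured for the task in order; return the first success."""
    last_error = None
    for model in TASKS[task]:
        try:
            return model, providers[model](prompt)
        except Exception as exc:  # a real client would catch provider errors only
            last_error = exc
    raise RuntimeError(f"all models failed for task {task!r}") from last_error

def primary_is_down(prompt):
    # Simulate an unavailable primary model.
    raise TimeoutError("provider unavailable")

providers = {
    "gpt-4": primary_is_down,
    "claude-3": lambda prompt: f"echo: {prompt}",
}
model, text = complete_with_fallback("coding", "hi", providers)
print(model, text)  # the request fails over to the fallback model
```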

Environment Variables

Set API keys in your environment:

```bash
# OpenAI
export OPENAI_API_KEY="sk-..."

# Anthropic
export ANTHROPIC_API_KEY="sk-ant-..."
```
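
Because keys are referenced indirectly via api_key_env, a fail-fast check at startup avoids confusing authentication errors mid-request. A small sketch — the helper name is ours, not part of the library:

```python
import os

def missing_api_keys(required):
    """Return the names of required environment variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]

os.environ["EXAMPLE_DEMO_KEY"] = "sk-demo"  # simulate one configured key
missing = missing_api_keys(["EXAMPLE_DEMO_KEY", "EXAMPLE_ABSENT_KEY"])
print(missing)  # only the unset variable is reported
```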

Advanced Usage

Async Support

```python
import asyncio

from switchboard import Client

async def main():
    client = Client()
    response = await client.complete_async("Hello world")
    print(response.content)

asyncio.run(main())
```
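
An async entry point makes it straightforward to fan out several prompts concurrently with asyncio.gather. A sketch with a stub coroutine standing in for client.complete_async, so it runs without provider credentials:

```python
import asyncio

async def complete_async(prompt):
    # Stub standing in for Client.complete_async; a real call would hit a provider.
    await asyncio.sleep(0)
    return f"reply: {prompt}"

async def main():
    prompts = ["summarize A", "summarize B", "summarize C"]
    # gather schedules the requests concurrently and preserves input order.
    return await asyncio.gather(*(complete_async(p) for p in prompts))

results = asyncio.run(main())
print(results)
```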

Health Checks

```python
# Check specific model
health = client.health_check("gpt-4")
print(f"GPT-4 healthy: {health['gpt-4']}")

# Check all models
health = client.health_check()
for model, status in health.items():
    print(f"{model}: {'✓' if status else '✗'}")
```

Model Information

```python
# List available models
models = client.list_models()
print(f"Available models: {models}")

# Get model details
info = client.get_model_info("gpt-4")
print(f"Context length: {info['context_length']}")
```

Configuration Management

```python
# Reload configuration
client.reload_config()

# List configured tasks
tasks = client.list_tasks()
print(f"Available tasks: {tasks}")
```

Examples

See the examples/ directory for complete configuration examples.

Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Make your changes
  4. Run tests (pytest tests/)
  5. Commit your changes (git commit -m 'Add amazing feature')
  6. Push to the branch (git push origin feature/amazing-feature)
  7. Open a Pull Request

Development

```bash
# Clone repository
git clone https://github.com/callmeumer/switchboard.git
cd switchboard

# Install in development mode
pip install -e ".[dev]"

# Run tests
pytest tests/

# Run linting
black . && flake8 . && mypy switchboard
```

License

This project is licensed under the MIT License - see the LICENSE file for details.

Changelog

See CHANGELOG.md for a list of changes and version history.
