# Switchboard

Config-driven AI model switching made simple.
Switchboard is a Python library that provides a unified API for switching between different AI models and providers with built-in fallback support. Configure once, switch seamlessly.
## Features

- **Unified API** - Single interface for OpenAI, Anthropic, and more
- **Task-based routing** - Automatically select models based on task type
- **Fallback chains** - Automatic failover when primary models are unavailable
- **Dynamic model discovery** - Automatically fetches available models from provider APIs
- **Configuration-driven** - YAML-based configuration for easy management
- **Environment-aware** - Support for development, staging, and production configs
- **Type-safe** - Full type hints and Pydantic validation
## Quick Start

### Installation

```bash
# Base installation
pip install switchboard-ai

# With OpenAI support
pip install switchboard-ai[openai]

# With Anthropic support
pip install switchboard-ai[anthropic]

# With all providers
pip install switchboard-ai[all]
```
### Basic Usage

1. Create a configuration file, `switchboard.yaml`:

```yaml
models:
  gpt-4:
    provider: openai
    model_name: gpt-4
    api_key_env: OPENAI_API_KEY
    max_tokens: 4096
    temperature: 0.7
  claude-3:
    provider: anthropic
    model_name: claude-3-sonnet-20240229
    api_key_env: ANTHROPIC_API_KEY
    max_tokens: 4096
    temperature: 0.7

tasks:
  coding:
    primary_model: gpt-4
    fallback_models: [claude-3]
    description: Code generation and programming

default_model: gpt-4
```
2. Set your API keys:

```bash
export OPENAI_API_KEY="your-openai-key"
export ANTHROPIC_API_KEY="your-anthropic-key"
```
3. Use it in your code:

```python
from switchboard import Client

# Initialize client
client = Client()

# Generate a completion
response = client.complete("Write a Python function to calculate fibonacci numbers")
print(response.content)

# Use task-based routing
response = client.complete("Write a Python function", task="coding")
print(f"Used model: {response.model}")

# Override the model for a specific request
response = client.complete("Hello", model="claude-3")
```
## Configuration

### Models

Define available models with their provider configurations:

```yaml
models:
  model-name:
    provider: openai|anthropic|local
    model_name: actual-model-id
    api_key_env: ENV_VAR_NAME
    max_tokens: 4096
    temperature: 0.7
    timeout: 30
    extra_params:
      custom_param: value
```
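Note that `api_key_env` names an environment variable rather than embedding the key in the file, so configs stay safe to commit. A minimal sketch of how such a lookup could work (the `resolve_api_key` helper below is illustrative, not part of Switchboard's API):

```python
import os

def resolve_api_key(model_config: dict) -> str:
    """Look up the API key named by the model's api_key_env field."""
    env_var = model_config["api_key_env"]
    key = os.environ.get(env_var)
    if key is None:
        raise RuntimeError(f"Environment variable {env_var} is not set")
    return key

os.environ["OPENAI_API_KEY"] = "sk-demo"  # set here for illustration only
config = {"provider": "openai", "api_key_env": "OPENAI_API_KEY"}
print(resolve_api_key(config))  # -> sk-demo
```

Failing fast with a clear error when the variable is unset makes misconfigured environments easy to spot at startup.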
### Tasks

Configure task-based routing with fallback chains:

```yaml
tasks:
  task-name:
    primary_model: model-name
    fallback_models: [backup-model-1, backup-model-2]
    description: "Task description"
```
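The fallback behavior above amounts to trying the primary model first, then each fallback in order. A plain-Python sketch of that selection logic (the `pick_model` helper and `available` set are assumptions for illustration, not Switchboard internals):

```python
def pick_model(task_config: dict, available: set) -> str:
    """Return the first available model in the task's fallback chain."""
    chain = [task_config["primary_model"], *task_config.get("fallback_models", [])]
    for model in chain:
        if model in available:
            return model
    raise RuntimeError(f"No available model in chain: {chain}")

coding = {"primary_model": "gpt-4", "fallback_models": ["claude-3"]}
print(pick_model(coding, available={"gpt-4", "claude-3"}))  # -> gpt-4
print(pick_model(coding, available={"claude-3"}))           # -> claude-3
```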
### Environment Variables

Set API keys in your environment:

```bash
# OpenAI
export OPENAI_API_KEY="sk-..."

# Anthropic
export ANTHROPIC_API_KEY="sk-ant-..."
```
## Advanced Usage

### Async Support

```python
import asyncio

from switchboard import Client

async def main():
    client = Client()
    response = await client.complete_async("Hello world")
    print(response.content)

asyncio.run(main())
```
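An async entry point also lets you fan out several prompts concurrently with `asyncio.gather`. The sketch below uses a stub client so it runs standalone; with Switchboard you would substitute the real `Client` and its `complete_async` method:

```python
import asyncio

class StubClient:
    """Stand-in for switchboard.Client so this sketch runs offline."""
    async def complete_async(self, prompt: str) -> str:
        await asyncio.sleep(0)  # simulate network latency
        return f"echo: {prompt}"

async def main():
    client = StubClient()
    prompts = ["summarize", "translate", "classify"]
    # Issue all requests concurrently and await them together
    return await asyncio.gather(*(client.complete_async(p) for p in prompts))

results = asyncio.run(main())
print(results)  # -> ['echo: summarize', 'echo: translate', 'echo: classify']
```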
### Health Checks

```python
# Check a specific model
health = client.health_check("gpt-4")
print(f"GPT-4 healthy: {health['gpt-4']}")

# Check all models
health = client.health_check()
for model, status in health.items():
    print(f"{model}: {'✓' if status else '✗'}")
```
### Model Information

```python
# List available models
models = client.list_models()
print(f"Available models: {models}")

# Get model details
info = client.get_model_info("gpt-4")
print(f"Context length: {info['context_length']}")
```
### Configuration Management

```python
# Reload configuration
client.reload_config()

# List configured tasks
tasks = client.list_tasks()
print(f"Available tasks: {tasks}")
```
## Examples

See the `examples/` directory for complete configuration examples:

- `config-dev.yaml` - Development configuration
- `config-prod.yaml` - Production configuration
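As an illustration of the environment-aware setup, a development config might route tasks to a cheaper model than production does. This fragment is hypothetical; consult the actual files in `examples/` for the real contents:

```yaml
# Hypothetical config-dev.yaml fragment (not the shipped file)
models:
  gpt-3.5:
    provider: openai
    model_name: gpt-3.5-turbo
    api_key_env: OPENAI_API_KEY

tasks:
  coding:
    primary_model: gpt-3.5   # cheaper model while developing
    fallback_models: []

default_model: gpt-3.5
```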
## Contributing

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Make your changes
4. Run tests (`pytest tests/`)
5. Commit your changes (`git commit -m 'Add amazing feature'`)
6. Push to the branch (`git push origin feature/amazing-feature`)
7. Open a Pull Request
## Development

```bash
# Clone the repository
git clone https://github.com/callmeumer/switchboard.git
cd switchboard

# Install in development mode
pip install -e ".[dev]"

# Run tests
pytest tests/

# Run linting
black . && flake8 . && mypy switchboard
```
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Support

- 📫 Issues: GitHub Issues
- 💬 Discussions: GitHub Discussions
- 🐦 Twitter: @callmeumer
## Changelog
See CHANGELOG.md for a list of changes and version history.