Timestep
Multi-model provider implementations for OpenAI Agents, supporting both OpenAI and Ollama models. Works with both local Ollama instances and Ollama Cloud.
Installation
```shell
pip install timestep
```
Quick Start
Using MultiModelProvider (Recommended)
The MultiModelProvider automatically routes requests to the appropriate provider based on model name prefixes:
```python
from timestep import MultiModelProvider, MultiModelProviderMap
from agents import Agent, Runner, RunConfig
import os

# Create a provider map and add Ollama support
model_provider_map = MultiModelProviderMap()
if os.environ.get("OLLAMA_API_KEY"):
    from timestep import OllamaModelProvider

    model_provider_map.add_provider(
        "ollama",
        OllamaModelProvider(api_key=os.environ.get("OLLAMA_API_KEY")),
    )

# Create a MultiModelProvider with OpenAI as the fallback
model_provider = MultiModelProvider(
    provider_map=model_provider_map,
    openai_api_key=os.environ.get("OPENAI_API_KEY", ""),
)

# Create an agent with a model name
agent = Agent(name="assistant", model="gpt-4")  # Uses OpenAI by default
# Or: agent = Agent(name="assistant", model="ollama/llama3")  # Uses Ollama

# Run the agent with a RunConfig
run_config = RunConfig(model_provider=model_provider)
agent_input = "Hello!"
result = Runner.run_streamed(agent, agent_input, run_config=run_config)
```
Using OllamaModelProvider Directly
```python
from timestep import OllamaModelProvider
from agents import Agent, Runner, RunConfig

# Create an Ollama provider for a local Ollama instance
ollama_provider = OllamaModelProvider()  # Defaults to localhost:11434

# For Ollama Cloud, pass an API key instead
cloud_provider = OllamaModelProvider(api_key="your-ollama-cloud-key")

# Create the agent and run it
agent = Agent(name="assistant", model="llama3")
run_config = RunConfig(model_provider=ollama_provider)
agent_input = "Hello!"
result = Runner.run_streamed(agent, agent_input, run_config=run_config)
```
Custom Provider Mapping
```python
from timestep import MultiModelProvider, MultiModelProviderMap, OllamaModelProvider
from agents import Agent, Runner, RunConfig
import os

# Create a custom mapping
model_provider_map = MultiModelProviderMap()

# Add an Ollama provider
if os.environ.get("OLLAMA_API_KEY"):
    model_provider_map.add_provider(
        "ollama",
        OllamaModelProvider(api_key=os.environ.get("OLLAMA_API_KEY")),
    )

# Use the custom mapping, falling back to OpenAI for unprefixed models
model_provider = MultiModelProvider(
    provider_map=model_provider_map,
    openai_api_key=os.environ.get("OPENAI_API_KEY", ""),
)

agent = Agent(name="assistant", model="ollama/llama3")
run_config = RunConfig(model_provider=model_provider)
agent_input = "Hello!"
result = Runner.run_streamed(agent, agent_input, run_config=run_config)
```
Components
MultiModelProvider
Automatically routes model requests to the appropriate provider based on model name prefixes. Supports both OpenAI and Ollama models out of the box.
Features:
- Automatic provider selection based on model name prefix
- Default fallback to OpenAI for unprefixed models
- Support for custom provider mappings
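To make the routing rules concrete, here is a minimal sketch of prefix-based selection with an OpenAI fallback. This is an illustration of the behavior described above, not the library's actual implementation; the function name `route_model` is hypothetical.

```python
def route_model(model_name: str, provider_map: dict) -> tuple:
    """Return (provider_key, bare_model_name) for a model name.

    Names like "ollama/llama3" route to the "ollama" provider; names
    without a known prefix fall back to OpenAI unchanged.
    """
    if "/" in model_name:
        prefix, _, rest = model_name.partition("/")
        if prefix in provider_map:
            return provider_map[prefix], rest
        if prefix == "openai":
            return "openai", rest
    return "openai", model_name  # default fallback to OpenAI

providers = {"ollama": "ollama"}
print(route_model("ollama/llama3", providers))  # ('ollama', 'llama3')
print(route_model("gpt-4", providers))          # ('openai', 'gpt-4')
```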
OllamaModelProvider
Provides access to Ollama models (local or cloud).
Options:
- `api_key` (str, optional): API key for Ollama Cloud
- `base_url` (str, optional): Base URL for the Ollama instance (defaults to `http://localhost:11434` for local, `https://ollama.com` for cloud)
- `ollama_client` (Any, optional): Custom Ollama client instance
Features:
- Lazy client initialization (only loads when needed)
- Automatic cloud detection for models ending with `-cloud`
- Support for both local Ollama instances and Ollama Cloud
- Seamless switching between local and cloud models
OllamaModel
Direct model implementation that converts Ollama responses to OpenAI-compatible format.
Features:
- Converts Ollama API responses to OpenAI format
- Supports streaming responses
- Handles tool calls and function calling
- Compatible with OpenAI Agents SDK
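As a rough illustration of the conversion, the sketch below maps an Ollama `/api/chat` response dict onto the OpenAI chat-completion shape. The field mapping is an assumption based on the two public APIs, not timestep's actual code, and only the common fields are covered.

```python
def ollama_to_openai(resp: dict) -> dict:
    """Map an Ollama /api/chat response onto the OpenAI
    chat-completion shape (common fields only; illustrative)."""
    return {
        "object": "chat.completion",
        "model": resp.get("model"),
        "choices": [
            {
                "index": 0,
                "message": {
                    "role": resp["message"]["role"],
                    "content": resp["message"]["content"],
                },
                "finish_reason": "stop" if resp.get("done") else None,
            }
        ],
        "usage": {
            "prompt_tokens": resp.get("prompt_eval_count", 0),
            "completion_tokens": resp.get("eval_count", 0),
        },
    }

ollama_resp = {
    "model": "llama3",
    "message": {"role": "assistant", "content": "Hi!"},
    "done": True,
    "prompt_eval_count": 5,
    "eval_count": 3,
}
converted = ollama_to_openai(ollama_resp)
print(converted["choices"][0]["message"]["content"])  # Hi!
```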
MultiModelProviderMap
Manages custom mappings of model name prefixes to providers.
Methods:
- `add_provider(prefix, provider)`: Add a prefix-to-provider mapping
- `remove_provider(prefix)`: Remove a mapping
- `get_provider(prefix)`: Get the provider for a prefix
- `has_prefix(prefix)`: Check whether a prefix exists
- `get_mapping()`: Get all mappings
- `set_mapping(mapping)`: Replace all mappings
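The methods above behave like a thin wrapper around a dict of prefix-to-provider entries. The toy class below mirrors that interface for illustration; it is a sketch, not the library source.

```python
class ProviderMapSketch:
    """Toy stand-in mirroring the MultiModelProviderMap methods listed above."""

    def __init__(self):
        self._mapping = {}

    def add_provider(self, prefix, provider):
        self._mapping[prefix] = provider

    def remove_provider(self, prefix):
        self._mapping.pop(prefix, None)

    def get_provider(self, prefix):
        return self._mapping.get(prefix)

    def has_prefix(self, prefix):
        return prefix in self._mapping

    def get_mapping(self):
        return dict(self._mapping)

    def set_mapping(self, mapping):
        self._mapping = dict(mapping)

m = ProviderMapSketch()
m.add_provider("ollama", "ollama-provider")
print(m.has_prefix("ollama"))  # True
m.remove_provider("ollama")
print(m.has_prefix("ollama"))  # False
```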
Features
- Multi-Model Support: Seamlessly switch between OpenAI and Ollama models
- Automatic Routing: Model names with prefixes (e.g., `ollama/llama3`) automatically route to the correct provider
- Customizable: Add your own providers using `MultiModelProviderMap`
- OpenAI Compatible: Works with the OpenAI Agents SDK
- Ollama Integration: Full support for both local Ollama instances and Ollama Cloud
Model Naming
- Models without a prefix (e.g., `gpt-4`) default to OpenAI
- Models with an `openai/` prefix (e.g., `openai/gpt-4`) use OpenAI
- Models with an `ollama/` prefix (e.g., `ollama/llama3`) use Ollama
Requirements
- Python >=3.11
- ollama >=0.6.0
- openai-agents >=0.4.2
Future Plans
We're actively developing additional features for the timestep library:
- Additional Abstractions: Gradually abstracting out other logic from Timestep AI into reusable library components
- CLI Tool: A proper command-line interface with tracing support for debugging and monitoring agent interactions
License
MIT