
Timestep

Multi-model provider implementations for OpenAI Agents, supporting both OpenAI and Ollama models. Works with both local Ollama instances and Ollama Cloud.

Installation

pip install timestep

Quick Start

Using MultiModelProvider (Recommended)

The MultiModelProvider automatically routes requests to the appropriate provider based on model name prefixes:

from timestep import MultiModelProvider, MultiModelProviderMap
from agents import Agent, Runner, RunConfig
import os

# Create a provider map and add Ollama support
model_provider_map = MultiModelProviderMap()

if os.environ.get("OLLAMA_API_KEY"):
    from timestep import OllamaModelProvider
    model_provider_map.add_provider(
        "ollama",
        OllamaModelProvider(api_key=os.environ.get("OLLAMA_API_KEY"))
    )

# Create MultiModelProvider with OpenAI fallback
model_provider = MultiModelProvider(
    provider_map=model_provider_map,
    openai_api_key=os.environ.get("OPENAI_API_KEY", ""),
)

# Create an agent with a model name (the Agents SDK requires a name)
agent = Agent(name="Assistant", model="gpt-4")  # Uses OpenAI by default
# Or: agent = Agent(name="Assistant", model="ollama/llama3")  # Uses Ollama

# Run the agent with a RunConfig
run_config = RunConfig(model_provider=model_provider)
result = Runner.run_streamed(agent, "Hello!", run_config=run_config)

Using OllamaModelProvider Directly

from timestep import OllamaModelProvider
from agents import Agent, Runner, RunConfig

# Create an Ollama provider for local Ollama instance
ollama_provider = OllamaModelProvider()  # Defaults to localhost:11434

# For Ollama Cloud, use the API key
cloud_provider = OllamaModelProvider(api_key="your-ollama-cloud-key")

# Create an agent and run it
agent = Agent(name="Assistant", model="llama3")
run_config = RunConfig(model_provider=ollama_provider)
result = Runner.run_streamed(agent, "Hello!", run_config=run_config)

Custom Provider Mapping

from timestep import MultiModelProvider, MultiModelProviderMap, OllamaModelProvider
from agents import Agent, Runner, RunConfig
import os

# Create a custom mapping
model_provider_map = MultiModelProviderMap()

# Add Ollama provider
if os.environ.get("OLLAMA_API_KEY"):
    model_provider_map.add_provider(
        "ollama",
        OllamaModelProvider(api_key=os.environ.get("OLLAMA_API_KEY"))
    )

# Use the custom mapping
model_provider = MultiModelProvider(
    provider_map=model_provider_map,
    openai_api_key=os.environ.get("OPENAI_API_KEY", ""),
)

agent = Agent(name="Assistant", model="ollama/llama3")
run_config = RunConfig(model_provider=model_provider)
result = Runner.run_streamed(agent, "Hello!", run_config=run_config)

Components

MultiModelProvider

Automatically routes model requests to the appropriate provider based on model name prefixes. Supports both OpenAI and Ollama models out of the box.

Features:

  • Automatic provider selection based on model name prefix
  • Default fallback to OpenAI for unprefixed models
  • Support for custom provider mappings
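Conceptually, prefix routing splits the model name on the first "/" and looks the prefix up in the provider map, falling back to OpenAI for unprefixed names. The sketch below is illustrative only, not the library's actual implementation; the route helper and provider names are invented for the example:

```python
# Illustrative sketch of prefix-based routing (not timestep's real code).
def route(model_name: str, providers=("ollama", "openai"), default="openai"):
    """Split "prefix/model" and pick a provider; unprefixed names use the default."""
    if "/" in model_name:
        prefix, bare_name = model_name.split("/", 1)
        if prefix in providers:
            return prefix, bare_name
    return default, model_name

print(route("gpt-4"))          # ('openai', 'gpt-4')
print(route("openai/gpt-4"))   # ('openai', 'gpt-4')
print(route("ollama/llama3"))  # ('ollama', 'llama3')
```

An unknown prefix (e.g. "foo/bar") falls through to the default provider in this sketch, which mirrors the documented behavior of defaulting to OpenAI.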

OllamaModelProvider

Provides access to Ollama models (local or cloud).

Options:

  • api_key (str, optional): API key for Ollama Cloud
  • base_url (str, optional): Base URL for Ollama instance (defaults to http://localhost:11434 for local, https://ollama.com for cloud)
  • ollama_client (Any, optional): Custom Ollama client instance

Features:

  • Lazy client initialization (only loads when needed)
  • Automatic cloud detection for models ending with -cloud
  • Support for both local Ollama instances and Ollama Cloud
  • Seamless switching between local and cloud models

OllamaModel

Direct model implementation that converts Ollama responses to OpenAI-compatible format.

Features:

  • Converts Ollama API responses to OpenAI format
  • Supports streaming responses
  • Handles tool calls and function calling
  • Compatible with OpenAI Agents SDK

MultiModelProviderMap

Manages custom mappings of model name prefixes to providers.

Methods:

  • add_provider(prefix, provider): Add a prefix-to-provider mapping
  • remove_provider(prefix): Remove a mapping
  • get_provider(prefix): Get provider for a prefix
  • has_prefix(prefix): Check if prefix exists
  • get_mapping(): Get all mappings
  • set_mapping(mapping): Replace all mappings
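A minimal dict-backed stand-in with the same interface shows how these methods compose. The class below is an illustration of the listed API, not timestep's implementation:

```python
class PrefixMap:
    """Dict-backed illustration of the MultiModelProviderMap interface."""

    def __init__(self):
        self._mapping = {}

    def add_provider(self, prefix, provider):
        self._mapping[prefix] = provider

    def remove_provider(self, prefix):
        self._mapping.pop(prefix, None)

    def get_provider(self, prefix):
        return self._mapping.get(prefix)

    def has_prefix(self, prefix):
        return prefix in self._mapping

    def get_mapping(self):
        return dict(self._mapping)

    def set_mapping(self, mapping):
        self._mapping = dict(mapping)

m = PrefixMap()
m.add_provider("ollama", "ollama-provider")
print(m.has_prefix("ollama"))    # True
print(m.get_provider("ollama"))  # ollama-provider
m.remove_provider("ollama")
print(m.has_prefix("ollama"))    # False
```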

Features

  • Multi-Model Support: Seamlessly switch between OpenAI and Ollama models
  • Automatic Routing: Model names with prefixes (e.g., ollama/llama3) automatically route to the correct provider
  • Customizable: Add your own providers using MultiModelProviderMap
  • OpenAI Compatible: Works with the OpenAI Agents SDK
  • Ollama Integration: Full support for both local Ollama instances and Ollama Cloud

Model Naming

  • Models without a prefix (e.g., gpt-4) default to OpenAI
  • Models with openai/ prefix (e.g., openai/gpt-4) use OpenAI
  • Models with ollama/ prefix (e.g., ollama/llama3) use Ollama

Requirements

  • Python >=3.11
  • ollama >=0.6.0
  • openai-agents >=0.4.2

Future Plans

We're actively developing additional features for the timestep library:

  • Additional Abstractions: Gradually abstracting out other logic from Timestep AI into reusable library components
  • CLI Tool: A proper command-line interface with tracing support for debugging and monitoring agent interactions

License

MIT

