Ollama and llama.cpp providers for the OpenAI Agents SDK

openai-agents-python-providers

Community model providers for the OpenAI Agents SDK.

Because OpenAI's SDK is intentionally focused on first-party integrations, this package provides ready-to-use ModelProvider implementations for locally-hosted and OpenAI-compatible backends:

| Provider | Backend |
| --- | --- |
| `OllamaProvider` | Ollama |
| `LlamaCppProvider` | llama.cpp, vLLM, and any OpenAI-compatible server |
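
Both classes are `ModelProvider` implementations: their job is to map a model name to a concrete model bound to a backend endpoint. A minimal, stdlib-only stand-in of that idea (the class names here are illustrative, not this package's actual implementation):

```python
# Illustrative sketch of the ModelProvider pattern: a provider maps a
# model name to a backend-specific model object. Names are hypothetical.
class EchoModel:
    def __init__(self, name: str, base_url: str):
        self.name = name
        self.base_url = base_url

class LocalProvider:
    def __init__(self, base_url: str):
        self.base_url = base_url

    def get_model(self, model_name: str) -> EchoModel:
        # A real provider would return a chat-completions model bound to
        # the OpenAI-compatible endpoint at base_url.
        return EchoModel(model_name, self.base_url)

model = LocalProvider("http://localhost:11434/v1").get_model("llama3.2")
```

Because both backends speak the OpenAI wire protocol, the same pattern covers Ollama, llama.cpp, and vLLM alike.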

Installation

```shell
pip install openai-agents-python-providers

# or with temporal support
pip install "openai-agents-python-providers[temporal]"
```

Quickstart

Ollama

Make sure Ollama is running and that you have pulled a model (for example, `ollama pull llama3.2`):

```python
import asyncio
import os
from agents import Agent, Runner, RunConfig
from openai_agents_providers import OllamaProvider

# Configure via environment or parameters
provider = OllamaProvider(
    model=os.getenv("MODEL_NAME", "llama3.2"),
    base_url=os.getenv("PROVIDER_URL", "http://localhost:11434/v1")
)

agent = Agent(
    name="Assistant",
    instructions="You are a helpful assistant.",
)

async def main():
    result = await Runner.run(
        agent,
        "What is the capital of France?",
        run_config=RunConfig(model_provider=provider),
    )
    print(result.final_output)

asyncio.run(main())
```

llama.cpp

Start a llama.cpp server:

```shell
llama-server --model my-model.gguf --port 8080
```

```python
import asyncio
import os
from agents import Agent, Runner, RunConfig
from openai_agents_providers import LlamaCppProvider

provider = LlamaCppProvider(
    base_url=os.getenv("PROVIDER_URL", "http://localhost:8080/v1"),
    model=os.getenv("MODEL_NAME"),  # optional
    api_key="sk-anything",
)

agent = Agent(
    name="Assistant",
    instructions="You are a helpful assistant.",
)

async def main():
    result = await Runner.run(
        agent,
        "Explain quantum entanglement in one sentence.",
        run_config=RunConfig(model_provider=provider),
    )
    print(result.final_output)

asyncio.run(main())
```

Temporal Integration

This package works seamlessly with the Temporal OpenAI Agents Plugin. You can use local providers like OllamaProvider or LlamaCppProvider while running agents durably in Temporal workflows.

See examples/temporal/ for a complete "tool-as-activity" demonstration.

```shell
# Install temporal dependencies
uv sync --group temporal

# Start the worker (pointing to your infrastructure)
TEMPORAL_ADDRESS="temporal.example.com:7233" \
PROVIDER_TYPE="ollama" \
MODEL_NAME="llama3.2" \
uv run examples/temporal/worker.py

# Start the workflow
TEMPORAL_ADDRESS="temporal.example.com:7233" \
uv run examples/temporal/starter.py "What is the weather where I am?"
```
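
The `PROVIDER_TYPE` and `PROVIDER_URL` variables above suggest a simple dispatch inside the worker. A hedged sketch of that pattern (the function name and defaults here are assumptions, not the example's actual code):

```python
# Illustrative dispatch: choose a provider class and base URL from
# environment-style settings, mirroring PROVIDER_TYPE / PROVIDER_URL.
def pick_provider(env: dict) -> tuple[str, str]:
    kind = env.get("PROVIDER_TYPE", "ollama")
    if kind == "ollama":
        return ("OllamaProvider", env.get("PROVIDER_URL", "http://localhost:11434/v1"))
    # anything else is treated as a generic OpenAI-compatible server
    return ("LlamaCppProvider", env.get("PROVIDER_URL", "http://localhost:8080/v1"))

choice = pick_provider({"PROVIDER_TYPE": "ollama"})
```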

API Reference

OllamaProvider

```python
OllamaProvider(
    *,
    base_url: str = "http://localhost:11434/v1",
    model: str | None = None,
    api_key: str = "ollama",
    **kwargs,          # forwarded to AsyncOpenAI
)
```

| Parameter | Default | Description |
| --- | --- | --- |
| `base_url` | `http://localhost:11434/v1` | Ollama API base URL |
| `model` | `None` | Model name (e.g. `"llama3.2"`, `"qwen3:8b"`). Overrides any name passed by the agent. |
| `api_key` | `"ollama"` | Ignored by Ollama; required by the OpenAI SDK. |

LlamaCppProvider

```python
LlamaCppProvider(
    *,
    base_url: str,
    model: str | None = None,
    api_key: str = "sk-anything",
    **kwargs,          # forwarded to AsyncOpenAI
)
```

| Parameter | Default | Description |
| --- | --- | --- |
| `base_url` | (required) | OpenAI-compatible API base URL, e.g. `http://localhost:8080/v1`. |
| `model` | `None` | Model name. Overrides any name passed by the agent. |
| `api_key` | `"sk-anything"` | Ignored by most backends; required by the OpenAI SDK. |
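
Both constructors forward extra keyword arguments to `AsyncOpenAI`, so client-level options such as `timeout` can be passed through. A stdlib-only sketch of that merge (the function name is illustrative, not part of this package's API):

```python
# Illustrative merge of provider-level settings with arbitrary extras,
# as they would be handed on to the underlying AsyncOpenAI client.
def client_kwargs(base_url: str, api_key: str, **kwargs) -> dict:
    return {"base_url": base_url, "api_key": api_key, **kwargs}

cfg = client_kwargs("http://localhost:8080/v1", "sk-anything", timeout=30)
```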

License

MIT
