# openai-agents-python-providers

Ollama and llama.cpp providers for the OpenAI Agents SDK.

Community model providers for the OpenAI Agents SDK. Because OpenAI's SDK is intentionally focused on first-party integrations, this package provides ready-to-use `ModelProvider` implementations for locally hosted and OpenAI-compatible backends:
| Provider | Backend |
|---|---|
| `OllamaProvider` | Ollama |
| `LlamaCppProvider` | llama.cpp, vLLM, and any OpenAI-compatible server |
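Both providers follow the same recipe: point an OpenAI-style client at an alternate `base_url`, supply a placeholder API key, and optionally pin a model name. A library-free sketch of that shared shape (the names here are illustrative, not the package's actual internals):

```python
from dataclasses import dataclass, field
from typing import Any, Optional

@dataclass
class LocalBackendConfig:
    """Illustrative shape shared by OpenAI-compatible local backends."""
    base_url: str                    # OpenAI-compatible endpoint, usually ending in /v1
    model: Optional[str] = None      # optional pinned model name
    api_key: str = "not-needed"      # local servers typically ignore this value
    client_kwargs: dict[str, Any] = field(default_factory=dict)  # extras forwarded to the client

# The two backends differ mainly in their defaults:
OLLAMA = LocalBackendConfig(base_url="http://localhost:11434/v1", api_key="ollama")
LLAMA_CPP = LocalBackendConfig(base_url="http://localhost:8080/v1", api_key="sk-anything")

print(OLLAMA.base_url)    # http://localhost:11434/v1
print(LLAMA_CPP.api_key)  # sk-anything
```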
## Installation

```bash
pip install openai-agents-python-providers

# or with temporal support
pip install "openai-agents-python-providers[temporal]"
```
## Quickstart

### Ollama

Make sure Ollama is running and you have a model pulled (e.g. `ollama pull llama3.2`):
```python
import asyncio
import os

from agents import Agent, Runner, RunConfig

from openai_agents_providers import OllamaProvider

# Configure via environment or parameters
provider = OllamaProvider(
    model=os.getenv("MODEL_NAME", "llama3.2"),
    base_url=os.getenv("PROVIDER_URL", "http://localhost:11434/v1"),
)

agent = Agent(
    name="Assistant",
    instructions="You are a helpful assistant.",
)

async def main():
    result = await Runner.run(
        agent,
        "What is the capital of France?",
        run_config=RunConfig(model_provider=provider),
    )
    print(result.final_output)

asyncio.run(main())
```
### llama.cpp

Start a llama.cpp server:

```bash
llama-server --model my-model.gguf --port 8080
```
```python
import asyncio
import os

from agents import Agent, Runner, RunConfig

from openai_agents_providers import LlamaCppProvider

provider = LlamaCppProvider(
    base_url=os.getenv("PROVIDER_URL", "http://localhost:8080/v1"),
    model=os.getenv("MODEL_NAME"),  # optional
    api_key="sk-anything",
)

agent = Agent(
    name="Assistant",
    instructions="You are a helpful assistant.",
)

async def main():
    result = await Runner.run(
        agent,
        "Explain quantum entanglement in one sentence.",
        run_config=RunConfig(model_provider=provider),
    )
    print(result.final_output)

asyncio.run(main())
```
## Temporal Integration

This package works seamlessly with the Temporal OpenAI Agents Plugin: you can use local providers such as `OllamaProvider` or `LlamaCppProvider` while running agents durably in Temporal workflows.

See `examples/temporal/` for a complete "tool-as-activity" demonstration.
```bash
# Install temporal dependencies
uv sync --group temporal

# Start the worker (pointing to your infrastructure)
TEMPORAL_ADDRESS="temporal.example.com:7233" \
PROVIDER_TYPE="ollama" \
MODEL_NAME="llama3.2" \
uv run examples/temporal/worker.py

# Start the workflow
TEMPORAL_ADDRESS="temporal.example.com:7233" \
uv run examples/temporal/starter.py "What is the weather where I am?"
```
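The worker above picks its backend from `PROVIDER_TYPE` and `MODEL_NAME`. A hypothetical sketch of that selection logic, with assumed defaults per backend (the real `examples/temporal/worker.py` may differ):

```python
import os

# Assumed per-backend defaults; not taken from the example's actual code
DEFAULT_URLS = {
    "ollama": "http://localhost:11434/v1",
    "llamacpp": "http://localhost:8080/v1",
}

def provider_settings(env=os.environ) -> dict:
    """Resolve provider settings from the same variables the worker reads."""
    kind = env.get("PROVIDER_TYPE", "ollama")
    if kind not in DEFAULT_URLS:
        raise ValueError(f"Unknown PROVIDER_TYPE: {kind!r}")
    return {
        "type": kind,
        "base_url": env.get("PROVIDER_URL", DEFAULT_URLS[kind]),
        "model": env.get("MODEL_NAME"),  # None -> let the provider/server decide
    }

print(provider_settings({"PROVIDER_TYPE": "ollama", "MODEL_NAME": "llama3.2"}))
```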
## API Reference

### OllamaProvider

```python
OllamaProvider(
    *,
    base_url: str = "http://localhost:11434/v1",
    model: str | None = None,
    api_key: str = "ollama",
    **kwargs,  # forwarded to AsyncOpenAI
)
```
| Parameter | Default | Description |
|---|---|---|
| `base_url` | `http://localhost:11434/v1` | Ollama API base URL. |
| `model` | `None` | Model name (e.g. `"llama3.2"`, `"qwen3:8b"`). Overrides any name passed by the agent. |
| `api_key` | `"ollama"` | Ignored by Ollama; required by the OpenAI SDK. |
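The quickstart reads these parameters from `PROVIDER_URL` and `MODEL_NAME`. A small helper sketch that gathers and sanity-checks those values before constructing the provider (the helper itself is hypothetical, not part of the package):

```python
import os
from urllib.parse import urlparse

def ollama_settings(env=os.environ) -> dict:
    """Collect OllamaProvider keyword arguments, mirroring the defaults above."""
    base_url = env.get("PROVIDER_URL", "http://localhost:11434/v1")
    parsed = urlparse(base_url)
    if parsed.scheme not in ("http", "https"):
        raise ValueError(f"PROVIDER_URL must be an http(s) URL, got {base_url!r}")
    return {
        "base_url": base_url,
        "model": env.get("MODEL_NAME"),  # None -> the agent's model name is used as-is
        "api_key": "ollama",             # ignored by Ollama, required by the OpenAI SDK
    }

print(ollama_settings({})["base_url"])  # http://localhost:11434/v1
```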
### LlamaCppProvider

```python
LlamaCppProvider(
    *,
    base_url: str,
    model: str | None = None,
    api_key: str = "sk-anything",
    **kwargs,  # forwarded to AsyncOpenAI
)
```
| Parameter | Default | Description |
|---|---|---|
| `base_url` | (required) | OpenAI-compatible API base URL, e.g. `http://localhost:8080/v1`. |
| `model` | `None` | Model name. Overrides any name passed by the agent. |
| `api_key` | `"sk-anything"` | Ignored by most backends; required by the OpenAI SDK. |
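Both providers document that a pinned `model` overrides the name the agent passes. That precedence can be shown in isolation (this helper is illustrative, not part of the package):

```python
def effective_model(provider_model, agent_model):
    """Illustrative precedence: a provider-pinned model wins over the agent's name."""
    return provider_model if provider_model is not None else agent_model

# With a model pinned on the provider, the agent's name is ignored:
print(effective_model("qwen3:8b", "gpt-4o"))  # qwen3:8b
# With no pinned model, the agent's name is passed through to the server:
print(effective_model(None, "my-model"))      # my-model
```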
## License

MIT