# Pydantic AI LiteLLM

A LiteLLM model integration for the Pydantic AI framework, enabling access to 100+ LLM providers through a unified interface.
## Features
- Universal LLM Access: Connect to 100+ LLM providers (OpenAI, Anthropic, Cohere, Bedrock, Azure, and many more) via LiteLLM
- Full Pydantic AI Integration: Complete support for tool calling, streaming, structured outputs, and all Pydantic AI features
- Type Safety: Fully typed with comprehensive type hints
- Async/Await Support: Built for modern async Python applications
- Flexible Configuration: Support for custom API endpoints, headers, and provider-specific settings
## Installation

```bash
pip install pydantic-ai-litellm
```
## Quick Start

```python
import asyncio

from pydantic_ai import Agent
from pydantic_ai_litellm import LiteLLMModel

# Initialize with any LiteLLM-supported model
model = LiteLLMModel(
    model_name="gpt-4",  # or claude-3-opus, gemini-pro, etc.
    api_key="your-api-key",  # will also check environment variables
)

# Create an agent
agent = Agent(model=model)

# Run inference
async def main():
    result = await agent.run("What is the capital of France?")
    print(result.output)

asyncio.run(main())
```
## Supported Providers
This library supports all providers available through LiteLLM, including:
- OpenAI: GPT-4, GPT-3.5, o1, etc.
- Anthropic: Claude 3 (Opus, Sonnet, Haiku)
- Google: Gemini Pro, Gemini Flash
- AWS Bedrock: Claude, Titan, Cohere models
- Azure OpenAI: All Azure-hosted models
- Cohere: Command, Command R+
- Mistral AI: Mistral 7B, 8x7B, Large
- And 90+ more providers
See the LiteLLM providers documentation for the complete list.
## Advanced Usage

### Custom API Endpoints
```python
model = LiteLLMModel(
    model_name="custom-model",
    api_base="https://your-custom-endpoint.com/v1",
    api_key="your-api-key",
    custom_llm_provider="openai",  # specify provider format
)
```
### Tool Calling
```python
from pydantic_ai import Agent
from pydantic_ai_litellm import LiteLLMModel

def get_weather(location: str) -> str:
    """Get weather for a location."""
    return f"It's sunny in {location}"

model = LiteLLMModel("gpt-4")
agent = Agent(model=model, tools=[get_weather])

# Inside an async function:
result = await agent.run("What's the weather in Paris?")
```
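Tool calling works by deriving a JSON-schema description of each function from its signature and docstring, which the model then uses to decide when and how to call it. A simplified, stdlib-only sketch of that derivation (the real Pydantic AI machinery is far richer, handling defaults, nested models, and validation):

```python
import inspect
from typing import get_type_hints

def get_weather(location: str) -> str:
    """Get weather for a location."""
    return f"It's sunny in {location}"

def tool_schema(fn) -> dict:
    """Turn a typed Python function into a minimal JSON-schema tool definition."""
    hints = get_type_hints(fn)
    hints.pop("return", None)  # the return type is not part of the call schema
    type_map = {str: "string", int: "integer", float: "number", bool: "boolean"}
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn),
        "parameters": {
            "type": "object",
            "properties": {n: {"type": type_map[t]} for n, t in hints.items()},
            "required": list(hints),
        },
    }

print(tool_schema(get_weather)["parameters"]["properties"])  # {'location': {'type': 'string'}}
```

This is why accurate type hints and docstrings on tool functions matter: they are the only description the model sees.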
### Streaming
```python
# Inside an async function:
async with agent.run_stream("Write a poem about AI") as stream:
    async for text in stream.stream_text(delta=True):
        print(text, end="", flush=True)
```
### Structured Output
```python
from pydantic import BaseModel

class Person(BaseModel):
    name: str
    age: int
    occupation: str

agent = Agent(model=model, output_type=Person)

# Inside an async function:
result = await agent.run("Generate a person profile")
print(result.output.name)  # result.output is typed as Person
```
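Structured output works because the model is asked to emit JSON matching the schema of `output_type`, and Pydantic then validates that JSON into the declared type. The same validation step can be seen with plain Pydantic (the JSON payloads here are made up for illustration):

```python
from pydantic import BaseModel, ValidationError

class Person(BaseModel):
    name: str
    age: int
    occupation: str

# A valid payload parses into a fully typed object
person = Person.model_validate_json(
    '{"name": "Ada Lovelace", "age": 36, "occupation": "Mathematician"}'
)
print(person.age)  # 36

# An invalid payload raises ValidationError (bad age, missing occupation),
# which is the signal the framework uses to reject a malformed model response
try:
    Person.model_validate_json('{"name": "Ada Lovelace", "age": "unknown"}')
except ValidationError as exc:
    print(len(exc.errors()))  # 2
```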
## Configuration
You can configure the model with various settings:
```python
from pydantic_ai_litellm import LiteLLMModel, LiteLLMModelSettings

settings: LiteLLMModelSettings = {
    'temperature': 0.7,
    'max_tokens': 1000,
    'litellm_api_key': 'your-key',
    'litellm_api_base': 'https://custom-endpoint.com',
    'extra_headers': {'Custom-Header': 'value'},
}

model = LiteLLMModel("gpt-4", settings=settings)
```
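Since the settings object is written as a plain dict literal (annotated with `LiteLLMModelSettings`, which appears to be a `TypedDict`), per-call overrides can be layered over shared defaults with ordinary dict merging. A small sketch under that assumption, using the same keys as the example above:

```python
# Shared defaults for every model instance
defaults = {"temperature": 0.7, "max_tokens": 1000}

# A stricter override for one deterministic task
overrides = {"temperature": 0.0}

# Later keys win, so the override replaces the default temperature
merged = {**defaults, **overrides}
print(merged)  # {'temperature': 0.7 -> 0.0, 'max_tokens': 1000}
```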
## Requirements

- Python 3.13+
- pydantic-ai-slim>=0.6.2
- litellm>=1.75.5
## Contributing
Contributions are welcome! Please feel free to submit issues and pull requests.
## License
MIT License - see LICENSE file for details.
## Examples

See the examples/ directory for complete working examples:

- Quick Start (examples/01_quick_start.py) - Basic usage
- Custom Endpoints (examples/02_custom_endpoints.py) - Using custom API endpoints
- Tool Calling (examples/03_tool_calling.py) - Functions as AI tools
- Streaming (examples/04_streaming.py) - Real-time text streaming
- Structured Output (examples/05_structured_output.py) - Typed responses with Pydantic
- Configuration (examples/06_configuration.py) - Model settings and parameters
Each example includes error handling and can be run independently with the appropriate API keys.