
Pydantic AI LiteLLM

A LiteLLM model integration for the Pydantic AI framework, enabling access to 100+ LLM providers through a unified interface.

Features

  • Universal LLM Access: Connect to 100+ LLM providers (OpenAI, Anthropic, Cohere, Bedrock, Azure, and many more) via LiteLLM
  • Full Pydantic AI Integration: Complete support for tool calling, streaming, structured outputs, and all Pydantic AI features
  • Type Safety: Fully typed with comprehensive type hints
  • Async/Await Support: Built for modern async Python applications
  • Flexible Configuration: Support for custom API endpoints, headers, and provider-specific settings

Installation

pip install pydantic-ai-litellm

Quick Start

import asyncio
from pydantic_ai import Agent
from pydantic_ai_litellm import LiteLLMModel

# Initialize with any LiteLLM-supported model
model = LiteLLMModel(
    model_name="gpt-4",  # or claude-3-opus, gemini-pro, etc.
    api_key="your-api-key"  # will also check environment variables
)

# Create an agent
agent = Agent(model=model)

# Run inference
async def main():
    result = await agent.run("What is the capital of France?")
    print(result.output)

asyncio.run(main())

Supported Providers

This library supports all providers available through LiteLLM, including:

  • OpenAI: GPT-4, GPT-3.5, o1, etc.
  • Anthropic: Claude 3 (Opus, Sonnet, Haiku)
  • Google: Gemini Pro, Gemini Flash
  • AWS Bedrock: Claude, Titan, Cohere models
  • Azure OpenAI: All Azure-hosted models
  • Cohere: Command, Command R+
  • Mistral AI: Mistral 7B, 8x7B, Large
  • And 90+ more providers

See the LiteLLM providers documentation for the complete list.
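LiteLLM routes requests by the model-name string: bare names default to OpenAI, while other providers use a `provider/model` prefix. A minimal sketch of that naming convention (the specific model identifiers below are illustrative examples, not a verified list):

```python
# Illustrative LiteLLM-style model names (identifiers are examples only;
# consult the LiteLLM providers documentation for current names):
MODEL_NAMES = {
    "openai": "gpt-4",  # bare names route to OpenAI
    "anthropic": "anthropic/claude-3-opus-20240229",
    "google": "gemini/gemini-pro",
    "mistral": "mistral/mistral-large-latest",
}

def provider_of(model_name: str) -> str:
    """Return the provider prefix of a LiteLLM model name ('openai' if absent)."""
    prefix, sep, _ = model_name.partition("/")
    return prefix if sep else "openai"
```

With this convention, switching providers is just a matter of passing a different `model_name` to `LiteLLMModel`.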

Advanced Usage

Custom API Endpoints

model = LiteLLMModel(
    model_name="custom-model",
    api_base="https://your-custom-endpoint.com/v1",
    api_key="your-api-key",
    custom_llm_provider="openai"  # specify provider format
)

Tool Calling

from pydantic_ai import Agent
from pydantic_ai_litellm import LiteLLMModel

def get_weather(location: str) -> str:
    """Get weather for a location."""
    return f"It's sunny in {location}"

model = LiteLLMModel("gpt-4")
agent = Agent(model=model, tools=[get_weather])

result = await agent.run("What's the weather in Paris?")  # inside an async function

Streaming

# inside an async function
async with agent.run_stream("Write a poem about AI") as stream:
    async for text in stream.stream_text(delta=True):
        print(text, end="", flush=True)

Structured Output

from pydantic import BaseModel

class Person(BaseModel):
    name: str
    age: int
    occupation: str

agent = Agent(model=model, output_type=Person)
result = await agent.run("Generate a person profile")  # inside an async function
print(result.output.name)  # result.output is a Person instance

Configuration

You can configure the model with various settings:

from pydantic_ai_litellm import LiteLLMModelSettings

settings: LiteLLMModelSettings = {
    'temperature': 0.7,
    'max_tokens': 1000,
    'litellm_api_key': 'your-key',
    'litellm_api_base': 'https://custom-endpoint.com',
    'extra_headers': {'Custom-Header': 'value'}
}

model = LiteLLMModel("gpt-4", settings=settings)

Requirements

  • Python 3.13+
  • pydantic-ai-slim>=0.6.2
  • litellm>=1.75.5

Contributing

Contributions are welcome! Please feel free to submit issues and pull requests.

License

MIT License - see LICENSE file for details.

Examples

See the examples/ directory for complete working examples:

  • Quick Start (examples/01_quick_start.py) - Basic usage
  • Custom Endpoints (examples/02_custom_endpoints.py) - Using custom API endpoints
  • Tool Calling (examples/03_tool_calling.py) - Functions as AI tools
  • Streaming (examples/04_streaming.py) - Real-time text streaming
  • Structured Output (examples/05_structured_output.py) - Typed responses with Pydantic
  • Configuration (examples/06_configuration.py) - Model settings and parameters

Each example includes error handling and can be run independently with the appropriate API keys.
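Since every example depends on a live provider, transient failures (rate limits, timeouts) are worth guarding against. A minimal, hypothetical retry wrapper, not part of this library, illustrating the kind of error handling the examples use:

```python
import asyncio

async def run_with_retry(make_coro, attempts=3, delay=1.0):
    """Await the coroutine produced by make_coro, retrying on any exception.

    A fresh coroutine is created per attempt, since a coroutine object
    can only be awaited once.
    """
    last_exc = None
    for attempt in range(attempts):
        try:
            return await make_coro()
        except Exception as exc:  # narrow to provider-specific errors in real code
            last_exc = exc
            if attempt < attempts - 1:
                await asyncio.sleep(delay)
    raise last_exc

# Usage sketch: await run_with_retry(lambda: agent.run("What is 2 + 2?"))
```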



Download files

Source Distribution

pydantic_ai_litellm-0.2.1.tar.gz (8.9 kB)

Built Distribution

pydantic_ai_litellm-0.2.1-py3-none-any.whl (9.1 kB)

File details

Details for the file pydantic_ai_litellm-0.2.1.tar.gz.

File metadata

  • Download URL: pydantic_ai_litellm-0.2.1.tar.gz
  • Size: 8.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for pydantic_ai_litellm-0.2.1.tar.gz:

  • SHA256: 03570f3612b09f9e50f32b5436882e0a89303421713215217a06af17b5481a45
  • MD5: 307bc8cc75d3089f7b04c998c44c84f2
  • BLAKE2b-256: 3f27900a425a04287666cbba19ac72b27c467ecec8387e91527a49d7e252cc41

Provenance

The following attestation bundles were made for pydantic_ai_litellm-0.2.1.tar.gz:

Publisher: pypi-release.yml on mochow13/pydantic-ai-litellm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file pydantic_ai_litellm-0.2.1-py3-none-any.whl.

File hashes

Hashes for pydantic_ai_litellm-0.2.1-py3-none-any.whl:

  • SHA256: de9259299ad0f91804e7d63bb04105076b0a47f0c74812016d610b04419a9d9f
  • MD5: 56ab75640a71c6f0905454a458d1d4ae
  • BLAKE2b-256: cc684adeb7a416e6f1bf56a75623e677b391099c01c5c930b0fd461c31a2329e

Provenance

The following attestation bundles were made for pydantic_ai_litellm-0.2.1-py3-none-any.whl:

Publisher: pypi-release.yml on mochow13/pydantic-ai-litellm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
