Pydantic AI LiteLLM

A LiteLLM model integration for the Pydantic AI framework, enabling access to 100+ LLM providers through a unified interface.

Features

  • Universal LLM Access: Connect to 100+ LLM providers (OpenAI, Anthropic, Cohere, Bedrock, Azure, and many more) via LiteLLM
  • Full Pydantic AI Integration: Complete support for tool calling, streaming, structured outputs, and all Pydantic AI features
  • Type Safety: Fully typed with comprehensive type hints
  • Async/Await Support: Built for modern async Python applications
  • Flexible Configuration: Support for custom API endpoints, headers, and provider-specific settings

Installation

pip install pydantic-ai-litellm
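Since the model falls back to environment variables when no api_key is passed, you can export your provider credentials instead of hard-coding them. A minimal sketch, assuming the standard provider variable names used by LiteLLM (the key values below are placeholders):

```shell
# Placeholder keys -- substitute your real credentials.
export OPENAI_API_KEY="sk-your-openai-key"
export ANTHROPIC_API_KEY="sk-ant-your-anthropic-key"
```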

Quick Start

import asyncio
from pydantic_ai import Agent
from pydantic_ai_litellm import LiteLLMModel

# Initialize with any LiteLLM-supported model
model = LiteLLMModel(
    model_name="gpt-4",  # or claude-3-opus, gemini-pro, etc.
    api_key="your-api-key"  # will also check environment variables
)

# Create an agent
agent = Agent(model=model)

# Run inference
async def main():
    result = await agent.run("What is the capital of France?")
    print(result.output)

asyncio.run(main())

Supported Providers

This library supports all providers available through LiteLLM, including:

  • OpenAI: GPT-4, GPT-3.5, o1, etc.
  • Anthropic: Claude 3 (Opus, Sonnet, Haiku)
  • Google: Gemini Pro, Gemini Flash
  • AWS Bedrock: Claude, Titan, Cohere models
  • Azure OpenAI: All Azure-hosted models
  • Cohere: Command, Command R+
  • Mistral AI: Mistral 7B, 8x7B, Large
  • And 90+ more providers

See the LiteLLM providers documentation for the complete list.
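LiteLLM routes each request to a backend based on the model name, typically using a provider/model prefix convention (e.g. "anthropic/..." or "gemini/..."); bare names like "gpt-4" are inferred. A minimal stdlib sketch of parsing that convention (the model-name strings are illustrative, and provider_of is a hypothetical helper, not part of this library):

```python
def provider_of(model_name: str) -> str:
    """Return the provider prefix of a LiteLLM-style model name, or "" if absent."""
    prefix, sep, _ = model_name.partition("/")
    return prefix if sep else ""

# Illustrative model names in the provider/model format:
print(provider_of("anthropic/claude-3-opus-20240229"))  # anthropic
print(provider_of("gemini/gemini-pro"))                 # gemini
assert provider_of("gpt-4") == ""  # no prefix: LiteLLM infers the provider
```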

Advanced Usage

Custom API Endpoints

model = LiteLLMModel(
    model_name="custom-model",
    api_base="https://your-custom-endpoint.com/v1",
    api_key="your-api-key",
    custom_llm_provider="openai"  # specify provider format
)

Tool Calling

from pydantic_ai import Agent
from pydantic_ai_litellm import LiteLLMModel

def get_weather(location: str) -> str:
    """Get weather for a location."""
    return f"It's sunny in {location}"

model = LiteLLMModel("gpt-4")
agent = Agent(model=model, tools=[get_weather])

# Inside an async function:
result = await agent.run("What's the weather in Paris?")
print(result.output)

Streaming

# Inside an async function, reusing the agent from the Quick Start:
async with agent.run_stream("Write a poem about AI") as stream:
    async for text in stream.stream_text(delta=True):
        print(text, end="", flush=True)
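The delta loop above consumes an async iterator of text chunks. The same consumption pattern can be exercised without a live model by substituting a stand-in generator (fake_stream is a hypothetical helper for illustration, not part of the library):

```python
import asyncio

async def fake_stream():
    # Stand-in for stream.stream_text(delta=True): yields text deltas.
    for chunk in ["Roses ", "are ", "red."]:
        await asyncio.sleep(0)  # simulate waiting on the network
        yield chunk

async def main() -> str:
    parts = []
    async for delta in fake_stream():
        parts.append(delta)  # accumulate deltas into the full text
    return "".join(parts)

print(asyncio.run(main()))  # Roses are red.
```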

Structured Output

from pydantic import BaseModel

class Person(BaseModel):
    name: str
    age: int
    occupation: str

agent = Agent(model=model, output_type=Person)

# Inside an async function:
result = await agent.run("Generate a person profile")
print(result.output.name)  # result.output is typed as Person

Configuration

You can configure the model with various settings:

from pydantic_ai_litellm import LiteLLMModelSettings

settings: LiteLLMModelSettings = {
    'temperature': 0.7,
    'max_tokens': 1000,
    'litellm_api_key': 'your-key',
    'litellm_api_base': 'https://custom-endpoint.com',
    'extra_headers': {'Custom-Header': 'value'}
}

model = LiteLLMModel("gpt-4", settings=settings)
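Because the settings object is a plain typed dict, per-call overrides can be expressed as an ordinary dict merge before constructing the model. This is plain Python dict behavior, not a library API (later entries win):

```python
defaults = {'temperature': 0.7, 'max_tokens': 1000}
per_call = {'temperature': 0.2}

# In {**a, **b}, keys from b override keys from a.
merged = {**defaults, **per_call}
print(merged)  # {'temperature': 0.2, 'max_tokens': 1000}
```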

Requirements

  • Python 3.13+
  • pydantic-ai-slim>=0.6.2
  • litellm>=1.75.5

Contributing

Contributions are welcome! Please feel free to submit issues and pull requests.

License

MIT License - see LICENSE file for details.

Examples

See the examples/ directory for complete working examples:

  • Quick Start (examples/01_quick_start.py) - Basic usage
  • Custom Endpoints (examples/02_custom_endpoints.py) - Using custom API endpoints
  • Tool Calling (examples/03_tool_calling.py) - Functions as AI tools
  • Streaming (examples/04_streaming.py) - Real-time text streaming
  • Structured Output (examples/05_structured_output.py) - Typed responses with Pydantic
  • Configuration (examples/06_configuration.py) - Model settings and parameters

Each example includes error handling and can be run independently with the appropriate API keys.
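For transient provider failures such as rate limits, a generic retry-with-exponential-backoff wrapper is one common pattern around agent calls. A self-contained sketch, where with_retries and the flaky demo call are illustrative, not part of this library's API:

```python
import asyncio
import random

async def with_retries(make_call, attempts: int = 3, base_delay: float = 0.01):
    """Retry an async call with exponential backoff and jitter."""
    for attempt in range(attempts):
        try:
            return await make_call()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: propagate the last error
            # Back off base_delay * 2^attempt seconds, plus jitter.
            await asyncio.sleep(base_delay * (2 ** attempt) * (0.5 + random.random()))

# Demo: a call that fails twice, then succeeds.
calls = {"n": 0}

async def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient error")
    return "ok"

print(asyncio.run(with_retries(flaky)))  # ok
```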
