
Universal LLM API client for Python. Unified interface for streaming, tool calling, and provider routing across 142+ LLM providers. Rust-powered.

Project description


Universal LLM API client for Python. Access 142+ LLM providers — OpenAI, Anthropic, Groq, Mistral, and more — through a single unified interface. Native async/await support, streaming responses, tool calling, and type-safe API.

Installation

Package Installation

Install via pip:

pip install liter-llm

System Requirements

  • Python 3.10+ required
  • API keys via environment variables (e.g. OPENAI_API_KEY, ANTHROPIC_API_KEY)
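
A quick pre-flight check catches missing keys before any request is made. This is a minimal sketch using only the standard library; which variables you actually need depends on the providers you route to.

import os
import sys

# Map each provider you plan to use to the environment variable it expects.
# Adjust this mapping to your own provider mix.
REQUIRED_KEYS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
}

missing = [var for var in REQUIRED_KEYS.values() if not os.environ.get(var)]
if missing:
    sys.exit(f"Missing API keys: {', '.join(missing)}")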

Quick Start

Basic Chat

Send a message to any provider using the provider/model prefix:

import asyncio
import os
from liter_llm import LlmClient

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])
    response = await client.chat(
        model="openai/gpt-4o",
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message.content)

asyncio.run(main())

Common Use Cases

Streaming Responses

Stream tokens in real time:

import asyncio
import os
from liter_llm import LlmClient

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])
    async for chunk in await client.chat_stream(
        model="openai/gpt-4o",
        messages=[{"role": "user", "content": "Tell me a story"}],
    ):
        if chunk.choices and chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)
    print()

asyncio.run(main())

Tool Calling

Define tools and handle the model's tool calls:

import asyncio
import os
from liter_llm import LlmClient

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])

    tools = [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a location",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {"type": "string", "description": "City name"},
                    },
                    "required": ["location"],
                },
            },
        }
    ]

    response = await client.chat(
        model="openai/gpt-4o",
        messages=[{"role": "user", "content": "What is the weather in Berlin?"}],
        tools=tools,
    )

    choice = response.choices[0]
    if choice.message.tool_calls:
        for call in choice.message.tool_calls:
            print(f"Tool: {call.function.name}, Args: {call.function.arguments}")

asyncio.run(main())
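
The example above stops at printing the requested call. To complete the round trip, run the tool locally, send its result back, and ask the model for a final answer. The sketch below continues inside main(); the "tool" role message shape and the call.id attribute follow the OpenAI convention and are assumptions about what liter_llm accepts, so verify them against the API reference.

    # Continuation of main() above: execute the requested tool, return its
    # result to the model, and print the final answer.
    import json

    def get_weather(location: str) -> str:
        # Hypothetical local implementation; swap in a real weather lookup.
        return f"Sunny, 22 degrees C in {location}"

    if choice.message.tool_calls:
        followup_messages = [
            {"role": "user", "content": "What is the weather in Berlin?"},
            choice.message,  # assumes the assistant message can be re-sent as-is
        ]
        for call in choice.message.tool_calls:
            args = json.loads(call.function.arguments)  # assumes JSON-encoded arguments
            followup_messages.append({
                "role": "tool",
                "tool_call_id": call.id,  # attribute name assumed (OpenAI-style)
                "content": get_weather(**args),
            })
        followup = await client.chat(
            model="openai/gpt-4o",
            messages=followup_messages,
            tools=tools,
        )
        print(followup.choices[0].message.content)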


Features

Supported Providers (142+)

Route to any provider using the provider/model prefix convention:

Provider          Example Models
OpenAI            openai/gpt-4o, openai/gpt-4o-mini
Anthropic         anthropic/claude-3-5-sonnet-20241022
Groq              groq/llama-3.1-70b-versatile
Mistral           mistral/mistral-large-latest
Cohere            cohere/command-r-plus
Together AI       together/meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo
Fireworks         fireworks/accounts/fireworks/models/llama-v3p1-70b-instruct
Google Vertex     vertexai/gemini-1.5-pro
Amazon Bedrock    bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0

For every supported provider, see the complete provider list.

Key Capabilities

  • Provider Routing -- Single client for 142+ LLM providers via provider/model prefix

  • Unified API -- Consistent chat, chat_stream, embeddings, list_models interface (see the sketch after this list)

  • Streaming -- Real-time token streaming via chat_stream

  • Tool Calling -- Function calling and tool use across all supporting providers

  • Type Safe -- Schema-driven types compiled from JSON schemas

  • Secure -- API keys never logged or serialized, managed via environment variables

  • Observability -- Built-in OpenTelemetry with GenAI semantic conventions

  • Error Handling -- Structured errors with provider context and retry hints
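
The chat and chat_stream calls are shown above; the sketch below exercises the other two unified methods. The method names come from this list, but the parameters and response shapes are assumptions modeled on the chat() examples, so treat it as illustrative rather than a reference.

import asyncio
import os
from liter_llm import LlmClient

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])

    # List the models the client can route to (arguments and response shape assumed).
    models = await client.list_models()
    print(models)

    # Create embeddings through the same unified client (parameter names assumed).
    result = await client.embeddings(
        model="openai/text-embedding-3-small",
        input=["Hello, world!"],
    )
    print(result)

asyncio.run(main())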

Performance

Built on a compiled Rust core for speed and safety:

  • Provider resolution at client construction -- zero per-request overhead
  • Configurable timeouts and connection pooling
  • Zero-copy streaming with SSE and AWS EventStream support
  • API keys wrapped in secure memory, zeroed on drop

Provider Routing

Route to 142+ providers using the provider/model prefix convention:

openai/gpt-4o
anthropic/claude-3-5-sonnet-20241022
groq/llama-3.1-70b-versatile
mistral/mistral-large-latest

See the provider registry for the full list.
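
In practice the request is identical across providers; only the model string changes. Here is a minimal sketch: it builds one client per provider key because that matches the constructor shown in Quick Start; whether a single client can hold credentials for several providers at once is not covered here.

import asyncio
import os
from liter_llm import LlmClient

PROMPT = [{"role": "user", "content": "Summarize special relativity in one sentence."}]

async def main() -> None:
    # Same call shape, different provider/model prefix.
    for key_var, model in [
        ("OPENAI_API_KEY", "openai/gpt-4o"),
        ("ANTHROPIC_API_KEY", "anthropic/claude-3-5-sonnet-20241022"),
    ]:
        client = LlmClient(api_key=os.environ[key_var])
        response = await client.chat(model=model, messages=PROMPT)
        print(f"{model}: {response.choices[0].message.content}")

asyncio.run(main())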

Documentation

Part of kreuzberg.dev.

Contributing

Contributions are welcome! See CONTRIBUTING.md for guidelines.

Join our Discord community for questions and discussion.

License

MIT -- see LICENSE for details.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

liter_llm-1.0.0rc7.tar.gz (220.2 kB)

Uploaded: Source

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

liter_llm-1.0.0rc7-cp310-abi3-win_amd64.whl (3.9 MB)

Uploaded: CPython 3.10+, Windows x86-64

liter_llm-1.0.0rc7-cp310-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (3.9 MB)

Uploaded: CPython 3.10+, manylinux (glibc 2.17+), x86-64

liter_llm-1.0.0rc7-cp310-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl (3.6 MB)

Uploaded: CPython 3.10+, manylinux (glibc 2.17+), ARM64

liter_llm-1.0.0rc7-cp310-abi3-macosx_11_0_arm64.whl (3.6 MB)

Uploaded: CPython 3.10+, macOS 11.0+, ARM64

File details

Details for the file liter_llm-1.0.0rc7.tar.gz.

File metadata

  • Download URL: liter_llm-1.0.0rc7.tar.gz
  • Upload date:
  • Size: 220.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for liter_llm-1.0.0rc7.tar.gz
Algorithm Hash digest
SHA256 aec11f60ff5f09f6a6727b5e6f15e7c76ba130c1aa84ed8f23c9fe8bccf44b40
MD5 9ec067e3b5386be876ea420c42cc0bac
BLAKE2b-256 476004d4c17014d112f863b6a706f9dd026301c346f05d65dbb416d948cb936f
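
To verify a download against the published digest, hash the file locally and compare. A minimal sketch with the standard library, assuming the sdist above was saved to the current directory:

import hashlib

EXPECTED_SHA256 = "aec11f60ff5f09f6a6727b5e6f15e7c76ba130c1aa84ed8f23c9fe8bccf44b40"

with open("liter_llm-1.0.0rc7.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == EXPECTED_SHA256 else "MISMATCH")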


Provenance

The following attestation bundles were made for liter_llm-1.0.0rc7.tar.gz:

Publisher: publish.yaml on kreuzberg-dev/liter-llm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file liter_llm-1.0.0rc7-cp310-abi3-win_amd64.whl.

File metadata

File hashes

Hashes for liter_llm-1.0.0rc7-cp310-abi3-win_amd64.whl
Algorithm Hash digest
SHA256 cb519c16ecec92b08d5e1467e66b7cecd62b41dd4ae516fca96ff54171eb3834
MD5 a048a3be61a975d89bf1530381c070de
BLAKE2b-256 908d3777bcea0932bffd85eabc7fec8ef0e23b914edf03f44e2d14e31e699366


Provenance

The following attestation bundles were made for liter_llm-1.0.0rc7-cp310-abi3-win_amd64.whl:

Publisher: publish.yaml on kreuzberg-dev/liter-llm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file liter_llm-1.0.0rc7-cp310-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl.

File metadata

File hashes

Hashes for liter_llm-1.0.0rc7-cp310-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
Algorithm Hash digest
SHA256 00ebf0e915704486e500410c2b6aa7912f5500a39ac89dd67d8df84712f6f045
MD5 5afa38c24378d5d770a3986f23dcb42d
BLAKE2b-256 e3b46563abf887d0147108d8b0601f146ca55fc5ccc0f839d336893ae2c679be


Provenance

The following attestation bundles were made for liter_llm-1.0.0rc7-cp310-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl:

Publisher: publish.yaml on kreuzberg-dev/liter-llm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file liter_llm-1.0.0rc7-cp310-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl.

File metadata

File hashes

Hashes for liter_llm-1.0.0rc7-cp310-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl
Algorithm Hash digest
SHA256 bf3d2780b9f5621390c45f2626d59a1f636433076d2dc62971eb6454ad4e0af0
MD5 28b8e961b65fd6b0213ceae7606e612a
BLAKE2b-256 aeaef597c667c921315a3edb329885f4754c59b83e97b888704c3568a82ec1cd


Provenance

The following attestation bundles were made for liter_llm-1.0.0rc7-cp310-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl:

Publisher: publish.yaml on kreuzberg-dev/liter-llm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file liter_llm-1.0.0rc7-cp310-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for liter_llm-1.0.0rc7-cp310-abi3-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 32ad5ae4638883fc684256fafc5e423b87f64af0640e2101a87c1f3e9a7f5064
MD5 e6d4e781dd247bb110700da9da48317c
BLAKE2b-256 23aeda0ddf4444761051d859034fbbbde0bfeeb410e44b9b43bf96bb2e9ad353


Provenance

The following attestation bundles were made for liter_llm-1.0.0rc7-cp310-abi3-macosx_11_0_arm64.whl:

Publisher: publish.yaml on kreuzberg-dev/liter-llm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
