Universal LLM API client for Python. Unified interface for streaming, tool calling, and provider routing across 142+ LLM providers. Rust-powered.

Project description


Universal LLM API client for Python. Access 142+ LLM providers — OpenAI, Anthropic, Groq, Mistral, and more — through a single unified interface. Native async/await support, streaming responses, tool calling, and type-safe API.

Installation

Package Installation

Install via pip:

pip install liter-llm

System Requirements

  • Python 3.10+ required
  • API keys via environment variables (e.g. OPENAI_API_KEY, ANTHROPIC_API_KEY)

Quick Start

Basic Chat

Send a message to any provider using the provider/model prefix:

import asyncio
import os
from liter_llm import LlmClient

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])
    response = await client.chat(
        model="openai/gpt-4o",
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message.content)

asyncio.run(main())

Common Use Cases

Streaming Responses

Stream tokens in real time:

import asyncio
import os
from liter_llm import LlmClient

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])
    async for chunk in await client.chat_stream(
        model="openai/gpt-4o",
        messages=[{"role": "user", "content": "Tell me a story"}],
    ):
        if chunk.choices and chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)
    print()

asyncio.run(main())

Tool Calling

Define and invoke tools:

import asyncio
import os
from liter_llm import LlmClient

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])

    tools = [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a location",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {"type": "string", "description": "City name"},
                    },
                    "required": ["location"],
                },
            },
        }
    ]

    response = await client.chat(
        model="openai/gpt-4o",
        messages=[{"role": "user", "content": "What is the weather in Berlin?"}],
        tools=tools,
    )

    choice = response.choices[0]
    if choice.message.tool_calls:
        for call in choice.message.tool_calls:
            print(f"Tool: {call.function.name}, Args: {call.function.arguments}")

asyncio.run(main())
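
The example above stops once a tool call is detected. A full round trip executes the tool locally and returns its output to the model as a "tool" message so it can compose a final answer. The sketch below assumes the OpenAI-style message convention (echoing the assistant turn back, then one message per call with role "tool" and a tool_call_id); those exact field names, and whether the returned tool_calls objects can be passed back verbatim, are assumptions to verify against the liter_llm docs.

import asyncio
import json
import os
from liter_llm import LlmClient

# Stand-in tool implementation; a real tool would call a weather service.
def get_weather(location: str) -> str:
    return f"Sunny, 22°C in {location}"

TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "City name"},
                },
                "required": ["location"],
            },
        },
    }
]

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])
    messages = [{"role": "user", "content": "What is the weather in Berlin?"}]

    first = await client.chat(model="openai/gpt-4o", messages=messages, tools=TOOLS)
    message = first.choices[0].message

    if message.tool_calls:
        # Echo the assistant turn back (assumed to be accepted as-is; otherwise
        # serialize the tool_calls objects to dicts), then answer each call.
        messages.append({"role": "assistant", "content": None, "tool_calls": message.tool_calls})
        for call in message.tool_calls:
            args = json.loads(call.function.arguments)
            messages.append({
                "role": "tool",
                "tool_call_id": call.id,
                "content": get_weather(**args),
            })

        final = await client.chat(model="openai/gpt-4o", messages=messages, tools=TOOLS)
        print(final.choices[0].message.content)

asyncio.run(main())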

Features

Supported Providers (142+)

Route to any provider using the provider/model prefix convention:

Provider         Example Models
OpenAI           openai/gpt-4o, openai/gpt-4o-mini
Anthropic        anthropic/claude-3-5-sonnet-20241022
Groq             groq/llama-3.1-70b-versatile
Mistral          mistral/mistral-large-latest
Cohere           cohere/command-r-plus
Together AI      together/meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo
Fireworks        fireworks/accounts/fireworks/models/llama-v3p1-70b-instruct
Google Vertex    vertexai/gemini-1.5-pro
Amazon Bedrock   bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0

Complete Provider List

Key Capabilities

  • Provider Routing -- Single client for 142+ LLM providers via provider/model prefix

  • Unified API -- Consistent chat, chat_stream, embeddings, list_models interface (see the sketch after this list)

  • Streaming -- Real-time token streaming via chat_stream

  • Tool Calling -- Function calling and tool use across all providers that support it

  • Type Safe -- Schema-driven types compiled from JSON schemas

  • Secure -- API keys never logged or serialized, managed via environment variables

  • Observability -- Built-in OpenTelemetry with GenAI semantic conventions

  • Error Handling -- Structured errors with provider context and retry hints
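
The examples so far exercise only chat and chat_stream. As a rough sketch of the other two methods named above, list_models and embeddings calls might look like this; the parameter names, embedding model string, and response shapes follow the OpenAI-style convention and are assumptions, not confirmed liter_llm API.

import asyncio
import os
from liter_llm import LlmClient

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])

    # Enumerate available models (method name from the capability list above;
    # the return shape is assumed to be a sequence).
    models = await client.list_models()
    print(list(models)[:5])

    # Embed a batch of texts; parameter names and response shape are assumed.
    result = await client.embeddings(
        model="openai/text-embedding-3-small",
        input=["The quick brown fox", "jumps over the lazy dog"],
    )
    print(len(result.data), "embeddings")

asyncio.run(main())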

Performance

Built on a compiled Rust core for speed and safety:

  • Provider resolution at client construction -- zero per-request overhead
  • Configurable timeouts and connection pooling (see the sketch after this list)
  • Zero-copy streaming with SSE and AWS EventStream support
  • API keys wrapped in secure memory, zeroed on drop
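
If timeouts and pooling are configured at client construction (consistent with provider resolution happening there, per the list above), it might look like the sketch below. The timeout and max_connections parameter names are hypothetical, shown only to indicate where such knobs would live; check the client reference for the real ones.

import os
from liter_llm import LlmClient

# Hypothetical constructor options -- the real parameter names may differ.
client = LlmClient(
    api_key=os.environ["OPENAI_API_KEY"],
    timeout=30.0,         # per-request timeout in seconds (assumed name)
    max_connections=100,  # connection pool size (assumed name)
)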

Provider Routing

Route to 142+ providers using the provider/model prefix convention:

openai/gpt-4o
anthropic/claude-3-5-sonnet-20241022
groq/llama-3.1-70b-versatile
mistral/mistral-large-latest

See the provider registry for the full list.
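
Because the provider is encoded in the model string, switching providers is a one-line change, and one client can fan the same prompt out to several providers. The sketch below assumes the client picks up per-provider keys (OPENAI_API_KEY, ANTHROPIC_API_KEY) from the environment when constructed without an explicit api_key; the Quick Start passes a single key, so treat the no-argument constructor as an assumption.

import asyncio
from liter_llm import LlmClient

PROMPT = [{"role": "user", "content": "One-sentence summary of Rust's ownership model."}]

async def main() -> None:
    # Assumes per-provider keys are read from the environment (see System
    # Requirements above); otherwise construct one client per provider.
    client = LlmClient()

    for model in ("openai/gpt-4o", "anthropic/claude-3-5-sonnet-20241022"):
        response = await client.chat(model=model, messages=PROMPT)
        print(f"{model}: {response.choices[0].message.content}")

asyncio.run(main())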

Documentation

Part of kreuzberg.dev.

Contributing

Contributions are welcome! See CONTRIBUTING.md for guidelines.

Join our Discord community for questions and discussion.

License

MIT -- see LICENSE for details.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

liter_llm-1.0.0rc9.tar.gz (232.7 kB)

Uploaded Source

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

liter_llm-1.0.0rc9-cp310-abi3-win_amd64.whl (3.9 MB)

Uploaded CPython 3.10+, Windows x86-64

liter_llm-1.0.0rc9-cp310-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (3.9 MB)

Uploaded CPython 3.10+, manylinux: glibc 2.17+ x86-64

liter_llm-1.0.0rc9-cp310-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl (3.6 MB)

Uploaded CPython 3.10+, manylinux: glibc 2.17+ ARM64

liter_llm-1.0.0rc9-cp310-abi3-macosx_11_0_arm64.whl (3.6 MB)

Uploaded CPython 3.10+, macOS 11.0+ ARM64

File details

Details for the file liter_llm-1.0.0rc9.tar.gz.

File metadata

  • Download URL: liter_llm-1.0.0rc9.tar.gz
  • Upload date:
  • Size: 232.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for liter_llm-1.0.0rc9.tar.gz
Algorithm Hash digest
SHA256 788a3c80ab8e997d8bf6cf713d1d4e72c7d0cbe9fc265fcb62f37d669d08cd5c
MD5 c87db00687d27510310bdb0c93860886
BLAKE2b-256 da8a3f6174238ec13361f8aba72cd233c2cbb21cb3018f2081790be1828ea268

See more details on using hashes here.
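
To check a downloaded sdist against the published SHA256 above, the standard-library hashlib suffices:

import hashlib

EXPECTED = "788a3c80ab8e997d8bf6cf713d1d4e72c7d0cbe9fc265fcb62f37d669d08cd5c"

# Hash the archive in chunks to keep memory use flat.
sha256 = hashlib.sha256()
with open("liter_llm-1.0.0rc9.tar.gz", "rb") as f:
    for chunk in iter(lambda: f.read(8192), b""):
        sha256.update(chunk)

print("OK" if sha256.hexdigest() == EXPECTED else "MISMATCH")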

Provenance

The following attestation bundles were made for liter_llm-1.0.0rc9.tar.gz:

Publisher: publish.yaml on kreuzberg-dev/liter-llm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file liter_llm-1.0.0rc9-cp310-abi3-win_amd64.whl.

File metadata

File hashes

Hashes for liter_llm-1.0.0rc9-cp310-abi3-win_amd64.whl
Algorithm Hash digest
SHA256 e10790a32bf058bbb350b138d5d6713833895e4d3c76cc1790e544a2e03ac828
MD5 2ce6d2092c841fd55f886df0f5364e39
BLAKE2b-256 455d89dd3393e4efe89e9f65f056b3488426b1bf04f69fe0a8ed1710d3442e0c

See more details on using hashes here.

Provenance

The following attestation bundles were made for liter_llm-1.0.0rc9-cp310-abi3-win_amd64.whl:

Publisher: publish.yaml on kreuzberg-dev/liter-llm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file liter_llm-1.0.0rc9-cp310-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl.

File metadata

File hashes

Hashes for liter_llm-1.0.0rc9-cp310-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
Algorithm Hash digest
SHA256 60311ada71b7dbf4ca03d33e2912d51d82dc330d1d11985696902c6ff2985c56
MD5 8b0a6de7748a5226b32a6a872bfe9fc9
BLAKE2b-256 24d7e03b241d830a1c1fb8046fac7af2f4eb45a45f6b7842d9bbc3e6212be02e

See more details on using hashes here.

Provenance

The following attestation bundles were made for liter_llm-1.0.0rc9-cp310-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl:

Publisher: publish.yaml on kreuzberg-dev/liter-llm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file liter_llm-1.0.0rc9-cp310-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl.

File metadata

File hashes

Hashes for liter_llm-1.0.0rc9-cp310-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl
Algorithm Hash digest
SHA256 8f7baafe87e7489e1d526ebbc440ca0dda8ed0c0eed5b4b4d92918cca860c3c7
MD5 61093863d233853cfce0636ab2c9d68d
BLAKE2b-256 0772636384028f238e0e4f17e837acc2348b8faab056623cb093b2df442bcdd0

See more details on using hashes here.

Provenance

The following attestation bundles were made for liter_llm-1.0.0rc9-cp310-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl:

Publisher: publish.yaml on kreuzberg-dev/liter-llm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file liter_llm-1.0.0rc9-cp310-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for liter_llm-1.0.0rc9-cp310-abi3-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 dfa7359a0e669683e764850586cc92f6484cead65ec5d60b15b3c2d760d75f9f
MD5 6c67db49b18e8261eb83d2277148071e
BLAKE2b-256 d68578ac3287d2f132252f4409a7b3e2d6681d7775e888fb14865dcb48e9115f

See more details on using hashes here.

Provenance

The following attestation bundles were made for liter_llm-1.0.0rc9-cp310-abi3-macosx_11_0_arm64.whl:

Publisher: publish.yaml on kreuzberg-dev/liter-llm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
