Universal LLM API client for Python. Unified interface for streaming, tool calling, and provider routing across 142+ LLM providers. Rust-powered.

Project description

Universal LLM API client for Python. Access 142+ LLM providers (OpenAI, Anthropic, Groq, Mistral, and more) through a single unified interface, with native async/await support, streaming responses, tool calling, and a type-safe API.

Installation

Package Installation

Install via pip:

pip install liter-llm

System Requirements

  • Python 3.10+ required
  • API keys via environment variables (e.g. OPENAI_API_KEY, ANTHROPIC_API_KEY)
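
For example, you can verify up front that the key you plan to use is actually set; this small check uses only the standard library and raises a clear error before any client is constructed:

import os

# Fail fast with a clear message if the key is missing.
api_key = os.environ.get("OPENAI_API_KEY")
if not api_key:
    raise RuntimeError("Set the OPENAI_API_KEY environment variable before creating a client")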

Quick Start

Basic Chat

Send a message to any provider using the provider/model prefix:

import asyncio
import os
from liter_llm import LlmClient

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])
    response = await client.chat(
        model="openai/gpt-4o",
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message.content)

asyncio.run(main())

Common Use Cases

Streaming Responses

Stream tokens in real time:

import asyncio
import os
from liter_llm import LlmClient

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])
    async for chunk in await client.chat_stream(
        model="openai/gpt-4o",
        messages=[{"role": "user", "content": "Tell me a story"}],
    ):
        if chunk.choices and chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)
    print()

asyncio.run(main())
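
If you also need the complete text once streaming finishes, accumulate the deltas as they arrive. The sketch below reuses the same chunk structure as the example above:

import asyncio
import os
from liter_llm import LlmClient

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])
    parts: list[str] = []
    async for chunk in await client.chat_stream(
        model="openai/gpt-4o",
        messages=[{"role": "user", "content": "Tell me a story"}],
    ):
        # Print each delta as it arrives and keep it for the full transcript.
        if chunk.choices and chunk.choices[0].delta.content:
            delta = chunk.choices[0].delta.content
            print(delta, end="", flush=True)
            parts.append(delta)
    print()
    full_text = "".join(parts)
    print(f"Received {len(full_text)} characters")

asyncio.run(main())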

Tool Calling

Define and invoke tools:

import asyncio
import os
from liter_llm import LlmClient

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])

    tools = [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a location",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {"type": "string", "description": "City name"},
                    },
                    "required": ["location"],
                },
            },
        }
    ]

    response = await client.chat(
        model="openai/gpt-4o",
        messages=[{"role": "user", "content": "What is the weather in Berlin?"}],
        tools=tools,
    )

    choice = response.choices[0]
    if choice.message.tool_calls:
        for call in choice.message.tool_calls:
            print(f"Tool: {call.function.name}, Args: {call.function.arguments}")

asyncio.run(main())
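
After the model requests a tool call, you typically execute the function yourself and send the result back so the model can compose a final answer. The round trip below is a sketch rather than the library's documented flow: the "assistant"/"tool" message shapes, the tool_call_id field, and call.id follow the common OpenAI convention and are assumptions here, and get_weather is a placeholder for your own implementation. Check the documentation for the exact message format.

import asyncio
import json
import os
from liter_llm import LlmClient

def get_weather(location: str) -> str:
    # Stand-in implementation; replace with a real weather lookup.
    return json.dumps({"location": location, "forecast": "sunny", "temperature_c": 21})

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])

    tools = [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a location",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {"type": "string", "description": "City name"},
                    },
                    "required": ["location"],
                },
            },
        }
    ]

    messages = [{"role": "user", "content": "What is the weather in Berlin?"}]
    response = await client.chat(model="openai/gpt-4o", messages=messages, tools=tools)
    choice = response.choices[0]

    if choice.message.tool_calls:
        # Echo the assistant turn that requested the calls, then append one
        # result message per call (message shapes assumed, see note above).
        messages.append({"role": "assistant", "tool_calls": choice.message.tool_calls})
        for call in choice.message.tool_calls:
            args = json.loads(call.function.arguments)
            result = get_weather(**args)
            messages.append({"role": "tool", "tool_call_id": call.id, "content": result})

        final = await client.chat(model="openai/gpt-4o", messages=messages, tools=tools)
        print(final.choices[0].message.content)

asyncio.run(main())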

Features

Supported Providers (142+)

Route to any provider using the provider/model prefix convention:

Provider          Example Model
OpenAI            openai/gpt-4o, openai/gpt-4o-mini
Anthropic         anthropic/claude-3-5-sonnet-20241022
Groq              groq/llama-3.1-70b-versatile
Mistral           mistral/mistral-large-latest
Cohere            cohere/command-r-plus
Together AI       together/meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo
Fireworks         fireworks/accounts/fireworks/models/llama-v3p1-70b-instruct
Google Vertex     vertexai/gemini-1.5-pro
Amazon Bedrock    bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0

See the provider registry for the complete provider list.

Key Capabilities

  • Provider Routing -- Single client for 142+ LLM providers via provider/model prefix

  • Unified API -- Consistent chat, chat_stream, embeddings, list_models interface (a short usage sketch follows this list)

  • Streaming -- Real-time token streaming via chat_stream

  • Tool Calling -- Function calling and tool use across all supporting providers

  • Type Safe -- Schema-driven types compiled from JSON schemas

  • Secure -- API keys never logged or serialized, managed via environment variables

  • Observability -- Built-in OpenTelemetry with GenAI semantic conventions

  • Error Handling -- Structured errors with provider context and retry hints
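
The non-chat surfaces named in the Unified API bullet are available on the same client. The sketch below is illustrative only: the argument names (provider for list_models, model and input for embeddings) are assumptions modeled on the common OpenAI-style shape, so consult the API reference for the exact signatures.

import asyncio
import os
from liter_llm import LlmClient

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])

    # List available models (argument name assumed).
    models = await client.list_models(provider="openai")
    print(models)

    # Create embeddings through the same unified interface (argument names assumed).
    embeddings = await client.embeddings(
        model="openai/text-embedding-3-small",
        input=["The quick brown fox"],
    )
    print(embeddings)

asyncio.run(main())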

Performance

Built on a compiled Rust core for speed and safety:

  • Provider resolution at client construction -- zero per-request overhead
  • Configurable timeouts and connection pooling (a caller-side timeout sketch follows this list)
  • Zero-copy streaming with SSE and AWS EventStream support
  • API keys wrapped in secure memory, zeroed on drop
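
Independent of the client's own timeout configuration, a caller can also bound an individual request at the application level with asyncio.wait_for. This sketch reuses only the chat call from Quick Start and standard-library asyncio:

import asyncio
import os
from liter_llm import LlmClient

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])
    # Cap the whole request at 30 seconds from the caller's side.
    response = await asyncio.wait_for(
        client.chat(
            model="openai/gpt-4o",
            messages=[{"role": "user", "content": "Hello!"}],
        ),
        timeout=30.0,
    )
    print(response.choices[0].message.content)

asyncio.run(main())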

Provider Routing

Route to 142+ providers using the provider/model prefix convention:

openai/gpt-4o
anthropic/claude-3-5-sonnet-20241022
groq/llama-3.1-70b-versatile
mistral/mistral-large-latest

See the provider registry for the full list.

Documentation

Part of kreuzberg.dev.

Contributing

Contributions are welcome! See CONTRIBUTING.md for guidelines.

Join our Discord community for questions and discussion.

License

MIT -- see LICENSE for details.

Download files

Download the file for your platform.

Source Distribution

liter_llm-1.0.0rc5.tar.gz (211.3 kB)

Uploaded: Source

Built Distributions

liter_llm-1.0.0rc5-cp310-abi3-win_amd64.whl (3.5 MB)

Uploaded: CPython 3.10+, Windows x86-64

liter_llm-1.0.0rc5-cp310-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (3.5 MB)

Uploaded: CPython 3.10+, manylinux: glibc 2.17+ x86-64

liter_llm-1.0.0rc5-cp310-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl (3.2 MB)

Uploaded: CPython 3.10+, manylinux: glibc 2.17+ ARM64

liter_llm-1.0.0rc5-cp310-abi3-macosx_11_0_arm64.whl (3.2 MB)

Uploaded: CPython 3.10+, macOS 11.0+ ARM64

File details

Details for the file liter_llm-1.0.0rc5.tar.gz.

File metadata

  • Download URL: liter_llm-1.0.0rc5.tar.gz
  • Upload date:
  • Size: 211.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for liter_llm-1.0.0rc5.tar.gz
Algorithm Hash digest
SHA256 65ecf4f2092d41422279976214f5532aa9e008171c13c5934e7a4ffb86e37b58
MD5 52ed6d8f5b25b4d351a20ad3c522404f
BLAKE2b-256 4b832e24d9317afa272edba6dfcda216e16a80228886ad53dc0862effaca4bf1

Provenance

The following attestation bundles were made for liter_llm-1.0.0rc5.tar.gz:

Publisher: publish.yaml on kreuzberg-dev/liter-llm

File details

Details for the file liter_llm-1.0.0rc5-cp310-abi3-win_amd64.whl.

File metadata

File hashes

Hashes for liter_llm-1.0.0rc5-cp310-abi3-win_amd64.whl
Algorithm Hash digest
SHA256 62debab05dde1b5a8c15172ea603d615fa875400cd9596462b231c008761131a
MD5 06efaf9e0c9c1f9e21775a057a76bcb4
BLAKE2b-256 a81a4421a603f2559188bdf0cea2f79ce2caa7da3ecd7d73fcf369ee70d8a704

Provenance

The following attestation bundles were made for liter_llm-1.0.0rc5-cp310-abi3-win_amd64.whl:

Publisher: publish.yaml on kreuzberg-dev/liter-llm

File details

Details for the file liter_llm-1.0.0rc5-cp310-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl.

File metadata

File hashes

Hashes for liter_llm-1.0.0rc5-cp310-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
Algorithm Hash digest
SHA256 0e03145a7ebd6cc103fc2b866e2e978961b65adb213e7ff52b297b90b2121a4e
MD5 2749cf11cc60b71a82a15cae62e30b15
BLAKE2b-256 7e247784015476d49e81a05f647a2e6536b44fddeaa7a1eaeeb07e771543fd8e

Provenance

The following attestation bundles were made for liter_llm-1.0.0rc5-cp310-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl:

Publisher: publish.yaml on kreuzberg-dev/liter-llm

File details

Details for the file liter_llm-1.0.0rc5-cp310-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl.

File metadata

File hashes

Hashes for liter_llm-1.0.0rc5-cp310-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl
Algorithm Hash digest
SHA256 9e2a91f15d5f4660abb2c011057f3e802c7c0d19057f0c47f3ed43eb8d406682
MD5 51ff0a36135148bc53d73896694591cc
BLAKE2b-256 650e9a099039f58a142166a9edec39018228cfb209135c23942469054382351c

Provenance

The following attestation bundles were made for liter_llm-1.0.0rc5-cp310-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl:

Publisher: publish.yaml on kreuzberg-dev/liter-llm

File details

Details for the file liter_llm-1.0.0rc5-cp310-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for liter_llm-1.0.0rc5-cp310-abi3-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 f227742e6fc10478d1cfd9094f84b8ecc02e81616492d4cf65811ae16196b3c0
MD5 441e8260996813eb4860f97f141cb370
BLAKE2b-256 bf2dbd6a3cd5a4649160d1d6ee36a08003665b19c62b0f0b690b5f44e046fee8

Provenance

The following attestation bundles were made for liter_llm-1.0.0rc5-cp310-abi3-macosx_11_0_arm64.whl:

Publisher: publish.yaml on kreuzberg-dev/liter-llm
