

Project description


Universal LLM API client for Python. Access 142+ LLM providers — OpenAI, Anthropic, Groq, Mistral, and more — through a single unified interface. Native async/await support, streaming responses, tool calling, and type-safe API.

Installation

Package Installation

Install via pip:

pip install liter-llm

System Requirements

  • Python 3.10+ required
  • API keys via environment variables (e.g. OPENAI_API_KEY, ANTHROPIC_API_KEY)
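
Exporting keys in your shell is the usual approach; for quick experiments a key can also be set in-process before the client is constructed. A minimal sketch (the placeholder value is illustrative only):

import os

# Prefer exporting OPENAI_API_KEY in your shell or process manager;
# setdefault only fills the variable if it is not already set.
os.environ.setdefault("OPENAI_API_KEY", "sk-...")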

Quick Start

Basic Chat

Send a message to any provider using the provider/model prefix:

import asyncio
import os
from liter_llm import LlmClient

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])
    response = await client.chat(
        model="openai/gpt-4o",
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message.content)

asyncio.run(main())

Common Use Cases

Streaming Responses

Stream tokens in real time:

import asyncio
import os
from liter_llm import LlmClient

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])
    async for chunk in await client.chat_stream(
        model="openai/gpt-4o",
        messages=[{"role": "user", "content": "Tell me a story"}],
    ):
        if chunk.choices and chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)
    print()

asyncio.run(main())

Tool Calling

Define and invoke tools:

import asyncio
import os
from liter_llm import LlmClient

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])

    tools = [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a location",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {"type": "string", "description": "City name"},
                    },
                    "required": ["location"],
                },
            },
        }
    ]

    response = await client.chat(
        model="openai/gpt-4o",
        messages=[{"role": "user", "content": "What is the weather in Berlin?"}],
        tools=tools,
    )

    choice = response.choices[0]
    if choice.message.tool_calls:
        for call in choice.message.tool_calls:
            print(f"Tool: {call.function.name}, Args: {call.function.arguments}")

asyncio.run(main())
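
The example above only surfaces the model's requested tool calls. The sketch below completes the round trip: run the tool locally and send the result back for a final answer. The message shapes (the assistant turn echoing tool_calls, the role "tool" reply keyed by tool_call_id, and the call.id attribute) follow the OpenAI convention and are assumptions, not confirmed by this page:

import asyncio
import json
import os
from liter_llm import LlmClient

TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "City name"},
                },
                "required": ["location"],
            },
        },
    }
]

def get_weather(location: str) -> str:
    # Stand-in implementation; a real tool would query a weather service.
    return f"Sunny and 22 degrees in {location}"

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])
    messages = [{"role": "user", "content": "What is the weather in Berlin?"}]

    first = await client.chat(model="openai/gpt-4o", messages=messages, tools=TOOLS)
    message = first.choices[0].message

    if message.tool_calls:
        # Echo the assistant turn so the tool results have a call to answer
        # (OpenAI-style message shape; assumed here).
        messages.append(
            {
                "role": "assistant",
                "content": message.content,
                "tool_calls": [
                    {
                        "id": call.id,  # assumed attribute, per OpenAI convention
                        "type": "function",
                        "function": {
                            "name": call.function.name,
                            "arguments": call.function.arguments,
                        },
                    }
                    for call in message.tool_calls
                ],
            }
        )
        for call in message.tool_calls:
            # arguments arrives as a JSON string in the OpenAI convention.
            args = json.loads(call.function.arguments)
            messages.append(
                {
                    "role": "tool",
                    "tool_call_id": call.id,
                    "content": get_weather(**args),
                }
            )

        final = await client.chat(model="openai/gpt-4o", messages=messages, tools=TOOLS)
        print(final.choices[0].message.content)

asyncio.run(main())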

Features

Supported Providers (142+)

Route to any provider using the provider/model prefix convention:

Provider         Example Models
OpenAI           openai/gpt-4o, openai/gpt-4o-mini
Anthropic        anthropic/claude-3-5-sonnet-20241022
Groq             groq/llama-3.1-70b-versatile
Mistral          mistral/mistral-large-latest
Cohere           cohere/command-r-plus
Together AI      together/meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo
Fireworks        fireworks/accounts/fireworks/models/llama-v3p1-70b-instruct
Google Vertex    vertexai/gemini-1.5-pro
Amazon Bedrock   bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0

See the complete provider list for the full set of supported providers.

Key Capabilities

  • Provider Routing -- Single client for 142+ LLM providers via provider/model prefix

  • Unified API -- Consistent chat, chat_stream, embeddings, list_models interface (see the sketch after this list)

  • Streaming -- Real-time token streaming via chat_stream

  • Tool Calling -- Function calling and tool use across all providers that support it

  • Type Safe -- Schema-driven types compiled from JSON schemas

  • Secure -- API keys never logged or serialized, managed via environment variables

  • Observability -- Built-in OpenTelemetry with GenAI semantic conventions

  • Error Handling -- Structured errors with provider context and retry hints
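
The chat and chat_stream methods are demonstrated above; the embeddings and list_models names also come from this page, but their exact signatures and response shapes are not documented here. A minimal sketch under OpenAI-style assumptions:

import asyncio
import os
from liter_llm import LlmClient

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])

    # Hypothetical signature: parameter names follow the common
    # OpenAI-style convention and may differ in the actual API.
    emb = await client.embeddings(
        model="openai/text-embedding-3-small",
        input=["Hello, world!"],
    )
    print(emb)  # inspect the response; its shape is not documented here

    # Hypothetical call: whether list_models accepts a provider filter
    # is not specified on this page.
    models = await client.list_models()
    print(models)

asyncio.run(main())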

Performance

Built on a compiled Rust core for speed and safety:

  • Provider resolution at client construction -- zero per-request overhead
  • Configurable timeouts and connection pooling (see the sketch after this list)
  • Zero-copy streaming with SSE and AWS EventStream support
  • API keys wrapped in secure memory, zeroed on drop
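
As referenced above, a sketch of what tuning timeouts and pooling might look like; the timeout and max_connections keyword arguments are hypothetical illustrations, not documented parameters:

import os
from liter_llm import LlmClient

# Hypothetical configuration; only api_key is confirmed by this page.
client = LlmClient(
    api_key=os.environ["OPENAI_API_KEY"],
    timeout=30.0,         # per-request timeout in seconds (hypothetical)
    max_connections=100,  # connection-pool size (hypothetical)
)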

Provider Routing

Route to 142+ providers using the provider/model prefix convention:

openai/gpt-4o
anthropic/claude-3-5-sonnet-20241022
groq/llama-3.1-70b-versatile
mistral/mistral-large-latest

See the provider registry for the full list.
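
Because the provider is encoded in the model string, switching providers is a one-line change. A minimal sketch, assuming both keys are set in the environment; whether a single client constructed with one api_key can route to several providers is not specified here, so one client per provider is used:

import asyncio
import os
from liter_llm import LlmClient

PROMPT = [{"role": "user", "content": "Say hello in one sentence."}]

async def ask(client: LlmClient, model: str) -> str:
    response = await client.chat(model=model, messages=PROMPT)
    return response.choices[0].message.content

async def main() -> None:
    openai_client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])
    anthropic_client = LlmClient(api_key=os.environ["ANTHROPIC_API_KEY"])

    print(await ask(openai_client, "openai/gpt-4o-mini"))
    print(await ask(anthropic_client, "anthropic/claude-3-5-sonnet-20241022"))

asyncio.run(main())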

Documentation

Part of kreuzberg.dev.

Contributing

Contributions are welcome! See CONTRIBUTING.md for guidelines.

Join our Discord community for questions and discussion.

License

MIT -- see LICENSE for details.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

liter_llm-1.0.0rc6.tar.gz (220.2 kB)

Uploaded: Source

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

liter_llm-1.0.0rc6-cp310-abi3-win_amd64.whl (3.9 MB)

Uploaded: CPython 3.10+, Windows x86-64

liter_llm-1.0.0rc6-cp310-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (3.9 MB)

Uploaded: CPython 3.10+, manylinux (glibc 2.17+), x86-64

liter_llm-1.0.0rc6-cp310-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl (3.6 MB)

Uploaded: CPython 3.10+, manylinux (glibc 2.17+), ARM64

liter_llm-1.0.0rc6-cp310-abi3-macosx_11_0_arm64.whl (3.6 MB)

Uploaded: CPython 3.10+, macOS 11.0+, ARM64

File details

Details for the file liter_llm-1.0.0rc6.tar.gz.

File metadata

  • Download URL: liter_llm-1.0.0rc6.tar.gz
  • Size: 220.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for liter_llm-1.0.0rc6.tar.gz
Algorithm Hash digest
SHA256 d6025242fd54cd2ad99582d9fa06ab32c7351d0c694820561c02857482b47d37
MD5 1ce3fa830120c3005d8835696265b5dc
BLAKE2b-256 44101936d00388afa5d77c9f1ad1a4725678a703a7843c028b694e52dd5fb53f

See more details on using hashes here.

Provenance

The following attestation bundles were made for liter_llm-1.0.0rc6.tar.gz:

Publisher: publish.yaml on kreuzberg-dev/liter-llm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file liter_llm-1.0.0rc6-cp310-abi3-win_amd64.whl.

File hashes

Hashes for liter_llm-1.0.0rc6-cp310-abi3-win_amd64.whl
Algorithm Hash digest
SHA256 892cfe676b7dd46e77960b76d7fcb76c88aca9f256b6e7949755a29bf5cc30a8
MD5 b9dd404a7aa3225dcf2044736d34de1b
BLAKE2b-256 4024a4db7a02aa23075159f40c5cf579bb8f4020418c66949e06ddade0f9a73b

See more details on using hashes here.

Provenance

The following attestation bundles were made for liter_llm-1.0.0rc6-cp310-abi3-win_amd64.whl:

Publisher: publish.yaml on kreuzberg-dev/liter-llm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file liter_llm-1.0.0rc6-cp310-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl.

File hashes

Hashes for liter_llm-1.0.0rc6-cp310-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
Algorithm Hash digest
SHA256 f39eab1e1e18fefb6af591981498ef48c52bc6a3cb4880af7e762ef0e3415ff7
MD5 9bccb9ad8a28a47fb73b152aa4b09485
BLAKE2b-256 eef80e9091eff88d56f629f741cbccbadaaa270bf35962452bfc079ccb14b5f2

See more details on using hashes here.

Provenance

The following attestation bundles were made for liter_llm-1.0.0rc6-cp310-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl:

Publisher: publish.yaml on kreuzberg-dev/liter-llm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file liter_llm-1.0.0rc6-cp310-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl.

File hashes

Hashes for liter_llm-1.0.0rc6-cp310-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl
Algorithm Hash digest
SHA256 8228d24de0465cc71769794c0305985dba7573790565e99502939a93c1f92e42
MD5 47abd050b4dc88fd5a321973b3741c12
BLAKE2b-256 dbb6b5bbed6ab9ceb5027cee7ae71dd6f062f421bfd54d0e117d61a72058cfb9

See more details on using hashes here.

Provenance

The following attestation bundles were made for liter_llm-1.0.0rc6-cp310-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl:

Publisher: publish.yaml on kreuzberg-dev/liter-llm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file liter_llm-1.0.0rc6-cp310-abi3-macosx_11_0_arm64.whl.

File hashes

Hashes for liter_llm-1.0.0rc6-cp310-abi3-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 27da8017b928dc87566f1bfb5bdaf3b28f21631fb6ea9cda3dd3fe7fdfb08571
MD5 e6b7ab16957d52467d1d5842e25dcada
BLAKE2b-256 4a046230b0486fb47d51cadba248bf5cdc6ab69f7e2517d4faef70f27f84af0f

See more details on using hashes here.

Provenance

The following attestation bundles were made for liter_llm-1.0.0rc6-cp310-abi3-macosx_11_0_arm64.whl:

Publisher: publish.yaml on kreuzberg-dev/liter-llm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
