
LiteLLM TD LLM Proxy Provider


Custom LiteLLM provider for Treasure Data's LLM Proxy API.

Installation

Using uv (recommended)

cd litellm-td-llm-provider
uv sync
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

Using pip

cd litellm-td-llm-provider
pip install -e .

Usage

Basic Completion

import litellm
from litellm_td_llm_provider import register_td_provider

# Register the TD provider
register_td_provider()

# Use with LiteLLM
response = litellm.completion(
    model="td/claude-sonnet-4-5",
    messages=[{"role": "user", "content": "Hello!"}],
    api_key="your-td-api-key",
    api_base="https://llm-proxy.us01.treasuredata.com",
)
print(response.choices[0].message.content)

Async Completion

import asyncio
import litellm
from litellm_td_llm_provider import register_td_provider

register_td_provider()

async def main():
    response = await litellm.acompletion(
        model="td/claude-sonnet-4-5",
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message.content)

asyncio.run(main())

Tool Calling

import litellm
from litellm_td_llm_provider import register_td_provider

register_td_provider()

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "City name"},
                },
                "required": ["location"],
            },
        },
    }
]

response = litellm.completion(
    model="td/claude-sonnet-4-5",
    messages=[{"role": "user", "content": "What's the weather in Tokyo?"}],
    tools=tools,
)

if response.choices[0].message.tool_calls:
    for tool_call in response.choices[0].message.tool_calls:
        print(f"Tool: {tool_call.function.name}")
        print(f"Arguments: {tool_call.function.arguments}")

Configuration

Available Models

The following Claude 4.5 models are available through TD LLM Proxy:

| Model | Model ID | Alias |
| --- | --- | --- |
| Claude Sonnet 4.5 | `claude-sonnet-4-5-20250929` | `claude-sonnet-4-5` |
| Claude Haiku 4.5 | `claude-haiku-4-5-20251001` | `claude-haiku-4-5` |
| Claude Opus 4.5 | `claude-opus-4-5-20251101` | `claude-opus-4-5` |

Note:

  • Aliases (e.g., claude-sonnet-4-5) automatically point to the latest version
  • Use specific versions (e.g., claude-sonnet-4-5-20250929) if you need to pin to a particular model snapshot

This provider uses claude-sonnet-4-5 as the default model.
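The alias-to-snapshot relationship above can be captured in code. A minimal sketch: the dictionary below simply restates the model table and is illustrative only, not part of the package's API.

```python
# Alias -> pinned snapshot mapping, as listed in the model table above.
# Illustrative only; this dict is not exported by the package.
MODEL_ALIASES = {
    "claude-sonnet-4-5": "claude-sonnet-4-5-20250929",
    "claude-haiku-4-5": "claude-haiku-4-5-20251001",
    "claude-opus-4-5": "claude-opus-4-5-20251101",
}

def pin_model(model: str) -> str:
    """Resolve an alias to its pinned snapshot; pass pinned IDs through."""
    return MODEL_ALIASES.get(model, model)

print(pin_model("claude-sonnet-4-5"))  # claude-sonnet-4-5-20250929
```

Pinning to a snapshot this way keeps behavior stable when an alias is repointed to a newer model version.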

Site Endpoints

The proxy endpoint is determined by the configured site; for example, the `us01` site corresponds to `https://llm-proxy.us01.treasuredata.com`, used as `api_base` in the examples above.

Environment Variables

export TD_LLM_API_KEY="your-api-key"
export TD_LLM_SITE="us01"  # Optional, defaults to us01
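These variables can be read in your own code as follows. Note that the endpoint pattern below generalizes the documented `us01` endpoint and is an assumption, not a guarantee for every site:

```python
import os

def td_endpoint(site: str) -> str:
    # Assumed URL pattern, generalized from the documented us01 endpoint.
    return f"https://llm-proxy.{site}.treasuredata.com"

api_key = os.environ.get("TD_LLM_API_KEY", "")
site = os.environ.get("TD_LLM_SITE", "us01")  # optional, defaults to us01

print(td_endpoint(site))
```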

Features

  • ✅ Streaming support
  • ✅ Non-streaming completion
  • ✅ Async completion (acompletion)
  • ✅ Tool calling support (OpenAI → Anthropic format conversion)
  • ✅ Multi-site support
  • ✅ Anthropic Messages API compatible
  • ✅ OpenAI-compatible interface
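The OpenAI → Anthropic tool-format conversion listed above can be illustrated with a simplified sketch. The function name is hypothetical and the provider's actual internals may differ:

```python
def openai_tool_to_anthropic(tool: dict) -> dict:
    """Sketch: map an OpenAI-style tool definition to Anthropic's shape.

    OpenAI nests the JSON Schema under function.parameters; Anthropic
    uses a flat object with an input_schema field.
    """
    fn = tool["function"]
    return {
        "name": fn["name"],
        "description": fn.get("description", ""),
        "input_schema": fn["parameters"],
    }

openai_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
        },
    },
}
print(openai_tool_to_anthropic(openai_tool)["name"])  # get_weather
```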

Development

Setup

# Install dependencies (creates venv automatically)
uv sync

# Activate virtual environment
source .venv/bin/activate

# Run tests
uv run pytest

Using Makefile

make install-dev  # Install dev dependencies
make format       # Format code with ruff
make lint         # Run linting
make check        # Run type checking with pyright
make test         # Run tests
make test-cov     # Run tests with coverage
make check-all    # Run all checks (format, lint, check, test)
make clean        # Clean cache and build artifacts

Release Process

  1. Update version in pyproject.toml
  2. Create and push a git tag:
    git tag v0.1.0
    git push origin v0.1.0
    
  3. GitHub Actions will automatically build and publish to PyPI

Required GitHub Secret:

  • PYPI_TOKEN: PyPI API token for publishing

Running Examples

# Set credentials
export TD_LLM_API_KEY="your-api-key"
export TD_LLM_SITE="us01"

# Run basic example
python examples/basic_usage.py

# Run site selection example
python examples/site_selection.py

Implementation Details

This provider is based on the TypeScript implementation in tdx-studio/electron/services/llm-proxy-client.ts and provides:

  1. Endpoint Management: Multi-region support with automatic site resolution
  2. Streaming: Server-Sent Events (SSE) parsing for real-time responses
  3. Error Handling: Proper error propagation and API error messages
  4. LiteLLM Integration: Custom provider registration with standard LiteLLM interface
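To illustrate item 2, here is a minimal Server-Sent Events line parser of the kind such a client needs. This is a generic sketch, not the provider's actual implementation, which must also track event types and handle lines split across network chunks:

```python
import json

def parse_sse_data_lines(raw: str) -> list[dict]:
    """Extract JSON payloads from the 'data:' lines of an SSE stream."""
    events = []
    for line in raw.splitlines():
        if line.startswith("data:"):
            payload = line[len("data:"):].strip()
            if payload and payload != "[DONE]":
                events.append(json.loads(payload))
    return events

stream = (
    'event: content_block_delta\n'
    'data: {"type": "content_block_delta", "delta": {"text": "Hi"}}\n'
    '\n'
)
print(parse_sse_data_lines(stream))
```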

API Compatibility

The TD LLM Proxy API is Anthropic-compatible and supports:

  • /v1/messages endpoint
  • x-api-key authentication
  • anthropic-version: 2023-06-01 header
  • Streaming via SSE with stream: true
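Under the compatibility notes above, a raw request to the proxy could be assembled as follows. This is a sketch that builds the request without sending it; `max_tokens` is included because the Anthropic Messages API requires it:

```python
import json

def build_messages_request(api_key: str, base_url: str, model: str,
                           messages: list, stream: bool = False):
    """Assemble URL, headers, and body for the Anthropic-compatible
    /v1/messages endpoint, per the compatibility notes above."""
    url = f"{base_url}/v1/messages"
    headers = {
        "x-api-key": api_key,
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    }
    body = {"model": model, "messages": messages, "max_tokens": 1024}
    if stream:
        body["stream"] = True  # enables SSE streaming
    return url, headers, json.dumps(body)

url, headers, body = build_messages_request(
    "your-td-api-key",
    "https://llm-proxy.us01.treasuredata.com",
    "claude-sonnet-4-5",
    [{"role": "user", "content": "Hello!"}],
    stream=True,
)
print(url)
```

In practice the provider performs this assembly for you; the sketch only makes the headers and payload shape concrete.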

