
LLM-Rosetta


English Version | Chinese Version

LLM-Rosetta — A Python library for converting between different LLM provider API formats using a hub-and-spoke architecture with a central IR (Intermediate Representation).

Full Documentation

Full documentation is available at:

The Problem

When building applications that work with multiple LLM providers, you face an N² conversion problem: every ordered pair of providers needs its own conversion logic, so N formats imply on the order of N² direct converters. LLM-Rosetta solves this with a hub-and-spoke approach: each provider needs only a single converter to and from the shared IR format, so the total grows linearly with the number of providers.

Provider A ──→ IR ──→ Provider B
Provider C ──→ IR ──→ Provider D
         ... and so on
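
For example, with the four API standards supported below, direct pairwise conversion would require 4 × 3 = 12 directional converters; routing through the central IR needs only one converter per format, or 8 directional conversions in total.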

Supported Providers

Provider    API Standard        Request   Response   Streaming
OpenAI      Chat Completions       ✓         ✓          ✓
OpenAI      Responses API          ✓         ✓          ✓
Anthropic   Messages API           ✓         ✓          ✓
Google      GenAI API              ✓         ✓          ✓

Ollama & Other OpenAI-Compatible Servers

LLM-Rosetta works out of the box with any server that exposes OpenAI-compatible endpoints. Ollama (v0.13+) is a great example — it supports three of the four API formats that LLM-Rosetta converts between:

Ollama Endpoint         LLM-Rosetta Converter   Since
/v1/chat/completions    openai_chat             Early versions
/v1/responses           openai_responses        v0.13.3
/v1/messages            anthropic               v0.14.0

Other compatible servers include HuggingFace TGI, vLLM, and LM Studio.
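
As a minimal sketch, a converted request can be sent straight to a local Ollama server through the openai SDK by pointing it at the /v1 endpoint. The model name, and the assumption that the converted request carries no model field of its own, are illustrative rather than documented LLM-Rosetta behavior:

from openai import OpenAI
from llm_rosetta import OpenAIChatConverter
from llm_rosetta.types.ir import Message, ContentPart

# Ollama serves an OpenAI-compatible API on localhost:11434; the api_key
# value is ignored by Ollama but required by the SDK
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

conv = OpenAIChatConverter()
ir_messages = [Message(role="user", content=[ContentPart(type="text", text="Hello!")])]
request = conv.request_to_provider({"messages": ir_messages})

# "llama3.2" is an illustrative model name; use any model pulled into Ollama
response = client.chat.completions.create(model="llama3.2", **request)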

Features

  • Unified IR format for messages, tool calls, and content parts
  • Bidirectional conversion: requests to provider format, responses from provider format
  • Streaming support with typed stream events
  • Auto-detection of provider from request/response objects
  • Support for text, images, tool calls, and tool results
  • Lightweight: the only required dependency is typing_extensions; provider SDKs are optional

Installation

Basic Installation

Install the core package (requires Python >= 3.8):

pip install llm-rosetta

Installing with Provider SDKs

# Individual providers
pip install llm-rosetta[openai]
pip install llm-rosetta[anthropic]
pip install llm-rosetta[google]

# All providers
pip install llm-rosetta[openai,anthropic,google]

Optional Dependencies

Extra        Packages       Description
openai       openai         OpenAI Chat Completions & Responses API
anthropic    anthropic      Anthropic Messages API
google       google-genai   Google GenAI API

Quick Start

from llm_rosetta import OpenAIChatConverter, AnthropicConverter

# Create converters
openai_conv = OpenAIChatConverter()
anthropic_conv = AnthropicConverter()

# Convert an OpenAI response to IR, then build an Anthropic-format request
# (openai_response: a response object previously returned by the OpenAI SDK)
ir_messages = openai_conv.response_from_provider(openai_response)
anthropic_request = anthropic_conv.request_to_provider(ir_messages)
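
From here the converted request can be sent with the Anthropic SDK. A hedged sketch, assuming anthropic_request holds Messages API fields such as messages, while model and max_tokens (which the Messages API requires) are supplied explicitly:

import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
reply = client.messages.create(
    model="claude-3-5-sonnet-latest",  # illustrative model name
    max_tokens=1024,
    **anthropic_request,
)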

Auto-Detection

from llm_rosetta import convert, detect_provider

# Automatically detect provider and convert
provider = detect_provider(some_response)
ir_messages = convert(some_response, direction="from_provider")
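
For instance, a plain dict shaped like an OpenAI Chat Completions response should be recognized without naming the provider; that detect_provider accepts raw dicts as well as SDK objects is an assumption here:

from llm_rosetta import convert, detect_provider

# A dict shaped like an OpenAI Chat Completions response
some_response = {
    "object": "chat.completion",
    "choices": [{"message": {"role": "assistant", "content": "Hello!"}}],
}

provider = detect_provider(some_response)
ir_messages = convert(some_response, direction="from_provider")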

Cross-Provider Conversation

from llm_rosetta import OpenAIChatConverter, GoogleGenAIConverter
from llm_rosetta.types.ir import Message, ContentPart

openai_conv = OpenAIChatConverter()
google_conv = GoogleGenAIConverter()

# Shared IR message history
ir_messages = []

# Turn 1: Ask OpenAI
ir_messages.append(Message(role="user", content=[ContentPart(type="text", text="Hello!")]))
# openai_client below is an instantiated OpenAI SDK client
openai_request = openai_conv.request_to_provider({"messages": ir_messages})
openai_response = openai_client.chat.completions.create(**openai_request)
ir_messages.extend(openai_conv.response_from_provider(openai_response))

# Turn 2: Continue with Google — full context preserved
google_request = google_conv.request_to_provider({"messages": ir_messages})

Related Projects

LLM-Rosetta is part of the ToolRegistry ecosystem.

Contributing

Contributions are welcome! Please visit the GitHub repository to get started.

License

This project is licensed under the MIT License — see the LICENSE file for details.

