LLM-Rosetta
Unified LLM API adapter: bidirectional conversion between OpenAI, Anthropic, and Google formats via an intermediate representation (IR).
LLM-Rosetta is a Python library for converting between different LLM provider API formats using a hub-and-spoke architecture with a central IR.
Full Documentation
Full documentation is available at:
- English: https://llm-rosetta.readthedocs.io/en/latest/
- Chinese (中文): https://llm-rosetta.readthedocs.io/zh-cn/latest/
The Problem
When building applications that work with multiple LLM providers, you face an N² conversion problem — every provider pair requires its own conversion logic. LLM-Rosetta solves this with a hub-and-spoke approach: each provider only needs a single converter to/from the shared IR format.
Provider A ──→ IR ──→ Provider B
Provider C ──→ IR ──→ Provider D
... and so on
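Concretely, with N formats a pairwise scheme needs N × (N − 1) direction-specific converters, while the hub needs only N. A quick count in plain Python (illustrative, not part of the library):

```python
def pairwise_converters(n: int) -> int:
    # Without a hub: every ordered pair of formats needs its own converter.
    return n * (n - 1)

def hub_converters(n: int) -> int:
    # With a central IR: one converter per format, to and from the IR.
    return n

# Four supported formats -> 12 pairwise converters, but only 4 via the IR.
print(pairwise_converters(4), hub_converters(4))  # 12 4
```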
Supported Providers
| Provider | API Standard | Request | Response | Streaming |
|---|---|---|---|---|
| OpenAI | Chat Completions | ✅ | ✅ | ✅ |
| OpenAI | Responses API | ✅ | ✅ | ✅ |
| Anthropic | Messages API | ✅ | ✅ | ✅ |
| Google | GenAI API | ✅ | ✅ | ✅ |
Ollama & Other OpenAI-Compatible Servers
LLM-Rosetta works out of the box with any server that exposes OpenAI-compatible endpoints. Ollama (v0.13+) is a great example — it supports three of the four API formats that LLM-Rosetta converts between:
| Ollama Endpoint | LLM-Rosetta Converter | Since |
|---|---|---|
| /v1/chat/completions | openai_chat | Early versions |
| /v1/responses | openai_responses | v0.13.3 |
| /v1/messages | anthropic | v0.14.0 |
Other compatible servers include HuggingFace TGI, vLLM, and LM Studio.
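For example, requests produced by the openai_chat converter can be sent to a local Ollama server through the OpenAI SDK by overriding the base URL. A minimal sketch, assuming Ollama's default port (11434) and an illustrative model name:

```python
def make_ollama_client(base_url: str = "http://localhost:11434/v1"):
    # The OpenAI SDK requires an API key to be set; Ollama ignores its value.
    from openai import OpenAI  # optional provider SDK (pip install openai)
    return OpenAI(base_url=base_url, api_key="ollama")

# client = make_ollama_client()
# response = client.chat.completions.create(model="llama3.2", messages=[...])
```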
Features
- Unified IR format for messages, tool calls, and content parts
- Bidirectional conversion: requests to provider format, responses from provider format
- Streaming support with typed stream events
- Auto-detection of provider from request/response objects
- Support for text, images, tool calls, and tool results
- Near-zero dependencies: typing_extensions is the only required package; provider SDKs are optional
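As a rough mental model of the IR (an illustrative approximation, not the library's actual definitions in llm_rosetta.types.ir), a conversation is a list of role-tagged messages, each holding typed content parts:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ContentPart:
    type: str                   # e.g. "text", "image", "tool_call", "tool_result"
    text: Optional[str] = None  # set for text parts

@dataclass
class Message:
    role: str                   # "user", "assistant", "system", ...
    content: List[ContentPart] = field(default_factory=list)

msg = Message(role="user", content=[ContentPart(type="text", text="Hello!")])
```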
Installation
Basic Installation
Install the core package (requires Python >= 3.8):
```shell
pip install llm-rosetta
```
Installing with Provider SDKs
```shell
# Individual providers
pip install llm-rosetta[openai]
pip install llm-rosetta[anthropic]
pip install llm-rosetta[google]

# All providers
pip install llm-rosetta[openai,anthropic,google]
```
Optional Dependencies
| Extra | Packages | Description |
|---|---|---|
| openai | openai | OpenAI Chat Completions & Responses API |
| anthropic | anthropic | Anthropic Messages API |
| google | google-genai | Google GenAI API |
Quick Start
```python
from llm_rosetta import OpenAIChatConverter, AnthropicConverter

# Create converters
openai_conv = OpenAIChatConverter()
anthropic_conv = AnthropicConverter()

# Convert an OpenAI response to IR, then to Anthropic request format
# (openai_response is a response object returned by the OpenAI SDK)
ir_messages = openai_conv.response_from_provider(openai_response)
anthropic_request = anthropic_conv.request_to_provider(ir_messages)
```
Auto-Detection
```python
from llm_rosetta import convert, detect_provider

# Automatically detect the provider and convert
provider = detect_provider(some_response)
ir_messages = convert(some_response, direction="from_provider")
```
Cross-Provider Conversation
```python
from llm_rosetta import OpenAIChatConverter, GoogleGenAIConverter
from llm_rosetta.types.ir import Message, ContentPart

# Create converters
openai_conv = OpenAIChatConverter()
google_conv = GoogleGenAIConverter()

# Shared IR message history
ir_messages = []

# Turn 1: Ask OpenAI (openai_client is an initialized OpenAI SDK client)
ir_messages.append(Message(role="user", content=[ContentPart(type="text", text="Hello!")]))
openai_request = openai_conv.request_to_provider({"messages": ir_messages})
openai_response = openai_client.chat.completions.create(**openai_request)
ir_messages.extend(openai_conv.response_from_provider(openai_response))

# Turn 2: Continue with Google; the full conversation context is preserved
google_request = google_conv.request_to_provider({"messages": ir_messages})
```
Contributing
Contributions are welcome! Please visit the GitHub repository to get started.
License
This project is licensed under the MIT License — see the LICENSE file for details.
File details
Details for the file llm_rosetta-0.4.2.tar.gz.
File metadata
- Download URL: llm_rosetta-0.4.2.tar.gz
- Upload date:
- Size: 182.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.20
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 84d873425a20387c37fe1d95e8b986b398b376cb27579bfebda43541a53d317f |
| MD5 | e546863b0d8cf85ad00e1855f3e629d1 |
| BLAKE2b-256 | 8ae7a62c6ff0249cd74bf4dc048beb947a7929647641387946fc5b1e555ba589 |
File details
Details for the file llm_rosetta-0.4.2-py3-none-any.whl.
File metadata
- Download URL: llm_rosetta-0.4.2-py3-none-any.whl
- Upload date:
- Size: 207.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.20
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 45dbfc4bf8e584e42f7f62a7a9b9a493483b7a69cffdf93e7ef4b4d0230bc5d8 |
| MD5 | c514ec5da8becdb8255cb44fe19e799d |
| BLAKE2b-256 | 151cf151548738c9d83dfd4223f77195a2d0d4a09ac9ce86f31f964e1883d585 |
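To verify a download against the SHA256 digests above, the standard-library hashlib module is sufficient. A small sketch (the filename is the source distribution listed earlier):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 16) -> str:
    # Hash the file in chunks so large archives do not need to fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the published digest, e.g.:
# sha256_of("llm_rosetta-0.4.2.tar.gz") == "84d873425a20387c37fe1d95e8b986b398b376cb27579bfebda43541a53d317f"
```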