
🔄 anth2oai

Anthropic to OpenAI API Wrapper

Use Anthropic's Claude models with an OpenAI-compatible API interface


Overview

anth2oai is a lightweight wrapper that allows you to use Anthropic's Claude models through an OpenAI-compatible API interface. This makes it easy to switch between OpenAI and Anthropic models in your existing codebase with minimal changes.

Features

  • OpenAI-compatible interface - Drop-in replacement for OpenAI client
  • Async & Sync support - Both AsyncAnth2OAI and Anth2OAI clients available
  • Streaming support - Full support for streaming responses
  • Tool/Function calling - Automatic conversion of OpenAI tools format to Anthropic format
  • System prompts - Automatic handling of system messages
  • Custom base URL - Support for Anthropic API proxies

Installation

pip install anth2oai

Or install from source:

git clone https://github.com/your-repo/anth2oai.git
cd anth2oai
pip install -e .

Quick Start

Async Client

import asyncio
from anth2oai import AsyncAnth2OAI

async def main():
    client = AsyncAnth2OAI(
        api_key="your-anthropic-api-key",
        # base_url="https://api.anthropic.com"  # Optional: custom endpoint
    )

    # Non-streaming
    response = await client.chat.completions.create(
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Hello, who are you?"}
        ],
        model="claude-sonnet-4-5-20250929",
    )
    print(response.choices[0].message.content)

    # Streaming
    stream = await client.chat.completions.create(
        messages=[{"role": "user", "content": "Tell me a joke"}],
        model="claude-sonnet-4-5-20250929",
        stream=True,
    )
    async for chunk in stream:
        if chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)

asyncio.run(main())
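Note that Anthropic's Messages API takes the system prompt as a top-level `system` parameter rather than as a message with role `"system"`, which is why the wrapper can accept the OpenAI-style message list above unchanged. A minimal sketch of that split (the helper name is illustrative, not the package's actual internals):

```python
def split_system_messages(messages):
    """Separate OpenAI-style system messages from the conversation.

    Returns (system_text, remaining_messages). Anthropic's Messages API
    expects the system prompt as a top-level parameter, so a wrapper has
    to pull system messages out of the list before forwarding it.
    """
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    return "\n\n".join(system_parts), rest


msgs = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello, who are you?"},
]
system, rest = split_system_messages(msgs)
print(system)  # You are a helpful assistant.
print(rest)    # [{'role': 'user', 'content': 'Hello, who are you?'}]
```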

Sync Client

from anth2oai import Anth2OAI

client = Anth2OAI(
    api_key="your-anthropic-api-key",
)

# Non-streaming
response = client.chat.completions.create(
    messages=[{"role": "user", "content": "Hello!"}],
    model="claude-sonnet-4-5-20250929",
)
print(response.choices[0].message.content)

# Streaming
for chunk in client.chat.completions.create(
    messages=[{"role": "user", "content": "Count to 5"}],
    model="claude-sonnet-4-5-20250929",
    stream=True,
):
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)

Tool/Function Calling

The wrapper automatically converts OpenAI's tool format to Anthropic's format:

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "City name, e.g., San Francisco",
                    },
                },
                "required": ["location"],
            },
        },
    },
]

response = await client.chat.completions.create(
    messages=[{"role": "user", "content": "What's the weather in Tokyo?"}],
    model="claude-sonnet-4-5-20250929",
    tools=tools,
)

# Check for tool calls
if response.choices[0].message.tool_calls:
    for tool_call in response.choices[0].message.tool_calls:
        print(f"Function: {tool_call.function.name}")
        print(f"Arguments: {tool_call.function.arguments}")
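For reference, Anthropic's tool schema is flat (`name`, `description`, `input_schema`) where OpenAI nests those fields under `function`. The helper below is an illustrative sketch of that mapping, not the package's actual internals:

```python
def to_anthropic_tool(openai_tool):
    """Map an OpenAI-style tool definition onto Anthropic's flat schema.

    OpenAI nests name/description/parameters under "function"; Anthropic's
    Messages API expects them at the top level, with the JSON Schema under
    "input_schema".
    """
    fn = openai_tool["function"]
    return {
        "name": fn["name"],
        "description": fn.get("description", ""),
        "input_schema": fn["parameters"],
    }


openai_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a location",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
        },
    },
}
anthropic_tool = to_anthropic_tool(openai_tool)
print(anthropic_tool["input_schema"]["required"])  # ['location']
```

Also note that in OpenAI's format `tool_call.function.arguments` is a JSON string, not a dict, so parse it with `json.loads` before invoking your function.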

Configuration

Environment Variables

You can configure the client using environment variables:

export OPENAI_API_KEY="your-anthropic-api-key"
export OPENAI_BASE_URL="https://api.anthropic.com"  # Optional

Then simply:

client = AsyncAnth2OAI()  # Will use env variables
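The implied precedence (an explicit argument wins, then the environment variable) can be sketched as follows; the function name is hypothetical, shown only to make the fallback order concrete:

```python
import os


def resolve_api_key(explicit=None):
    """Prefer an explicitly passed key; fall back to OPENAI_API_KEY."""
    key = explicit or os.environ.get("OPENAI_API_KEY")
    if key is None:
        raise ValueError("No API key: pass api_key= or set OPENAI_API_KEY")
    return key


os.environ["OPENAI_API_KEY"] = "sk-ant-example"
print(resolve_api_key())            # sk-ant-example
print(resolve_api_key("explicit"))  # explicit
```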

Available Parameters

  • api_key: Anthropic API key (default: the OPENAI_API_KEY environment variable)
  • base_url: API endpoint URL (default: https://api.anthropic.com)
  • max_tokens: Maximum tokens in the response (default: 1024)
  • timeout: Request timeout in seconds (default: None)

Response Format

Responses are returned in OpenAI's ChatCompletion format:

ChatCompletion(
    id='chatcmpl-xxx',
    choices=[
        Choice(
            finish_reason='stop',
            index=0,
            message=ChatCompletionMessage(
                content='Hello! How can I help you today?',
                role='assistant',
                tool_calls=None,
            ),
        )
    ],
    created=1234567890,
    model='claude-sonnet-4-5-20250929',
    object='chat.completion',
    usage=CompletionUsage(
        completion_tokens=10,
        prompt_tokens=5,
        total_tokens=15,
    ),
)
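The finish_reason field has to be derived from Anthropic's stop_reason. A wrapper like this typically maps end_turn and stop_sequence to "stop", max_tokens to "length", and tool_use to "tool_calls"; the sketch below shows that typical mapping, though the package's exact internals may differ:

```python
# Typical Anthropic stop_reason -> OpenAI finish_reason mapping (sketch).
_STOP_REASON_MAP = {
    "end_turn": "stop",
    "stop_sequence": "stop",
    "max_tokens": "length",
    "tool_use": "tool_calls",
}


def map_finish_reason(stop_reason):
    """Translate an Anthropic stop_reason into OpenAI's finish_reason."""
    return _STOP_REASON_MAP.get(stop_reason, "stop")


print(map_finish_reason("end_turn"))  # stop
print(map_finish_reason("tool_use"))  # tool_calls
```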

Supported Models

Any Anthropic model can be used. Common models include:

  • claude-sonnet-4-5-20250929 (Claude Sonnet 4.5)
  • claude-3-opus-20240229 (Claude 3 Opus)
  • claude-3-haiku-20240307 (Claude 3 Haiku)

License

MIT License - see LICENSE for details.


Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Acknowledgements

This project is inspired by the need to easily switch between OpenAI and Anthropic APIs in production applications.

Download files

Download the file for your platform.

Source Distribution

anth2oai-0.1.0.tar.gz (8.8 kB)

Built Distribution


anth2oai-0.1.0-py3-none-any.whl (9.2 kB)

File details

Details for the file anth2oai-0.1.0.tar.gz.

File metadata

  • Download URL: anth2oai-0.1.0.tar.gz
  • Size: 8.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.17 on macOS

File hashes

Hashes for anth2oai-0.1.0.tar.gz:

  • SHA256: e2ce6f02ebc2811618947c4c5e2b10a4c7998528b757e96331bc83e2334d34bd
  • MD5: 7890f86ab38aa7166e22eb3105b72e8a
  • BLAKE2b-256: ceaeae414b0322fa041324824563a3ab4a205686091c376b93a1f9e351246922

File details

Details for the file anth2oai-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: anth2oai-0.1.0-py3-none-any.whl
  • Size: 9.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.17 on macOS

File hashes

Hashes for anth2oai-0.1.0-py3-none-any.whl:

  • SHA256: 79dbdbe0026f16f5e010784dbb44d0dc23023064b73ffa157b4bcd076f6f98d1
  • MD5: 2425dda1b513212836b21eb53f277003
  • BLAKE2b-256: da49a42357f1f0b36104a00748d06ddf569e3fb967b2c0c34b7cc7c18d9fbed2
