Bridge OpenAI tools to Claude Code SDK

claudebridge


🌉 Bridge OpenAI tools to Claude Code SDK — use your subscription anywhere 🔌

⚠️ Legal Notice: This tool bridges OpenAI-compatible clients to Claude using the Claude Code SDK. The permissibility of this usage under Anthropic's Terms of Service is unclear. Use at your own peril. Please review the Legal Disclaimer section before use.

Why claudebridge?

You have a Claude subscription. You have tools that speak OpenAI's API. claudebridge connects them — no API keys, no extra costs.

Feature                         claudebridge   LiteLLM   Direct API
Uses your Claude subscription   Yes            No        No
No API key needed               Yes            No        No
One command to start            Yes            ~Yes      No
OpenAI-compatible               Yes            Yes       No

Features

  • Lightweight — Minimal dependencies, easy to understand
  • OpenAI-compatible — Drop-in replacement for /api/v1/chat/completions
  • Uses your subscription — No API keys needed, uses Claude Code OAuth
  • Streaming support — Real-time SSE responses matching OpenAI format
  • Connection pooling — Pre-spawned clients for reduced latency
  • Session logging — Full request/response logging for debugging

Quick Start

# Install globally
uv tool install git+https://github.com/tsilva/claudebridge

# Or install from source
git clone https://github.com/tsilva/claudebridge
cd claudebridge
uv pip install -e .

# Run the server
claudebridge

# Verify installed version
claudebridge --version

Local Development

When reinstalling from source, use --no-cache to ensure you get the latest code:

uv tool install . --force --no-cache

By default, the server listens at http://localhost:8082

Usage

With curl

# Non-streaming
curl -X POST http://localhost:8082/api/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "sonnet", "messages": [{"role": "user", "content": "Hello!"}]}'

# Streaming
curl -X POST http://localhost:8082/api/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "sonnet", "messages": [{"role": "user", "content": "Hello!"}], "stream": true}'
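
When "stream": true is set, the server replies with OpenAI-style Server-Sent Events. As an illustration of what a client does with that stream (this parser is a sketch of the standard OpenAI chunk format, not part of claudebridge), the text deltas can be extracted like so:

```python
import json

def parse_sse_chunks(raw: str) -> str:
    """Join the delta text out of OpenAI-style SSE lines (illustrative)."""
    parts = []
    for line in raw.splitlines():
        if not line.startswith("data: "):
            continue  # skip comments and blank keep-alive lines
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        delta = json.loads(payload)["choices"][0]["delta"]
        parts.append(delta.get("content", ""))
    return "".join(parts)
```

In practice the OpenAI SDK (next section) does this for you; the sketch just shows what travels over the wire.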

With OpenAI Python Client

from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8082/api/v1",
    api_key="not-needed"
)

response = client.chat.completions.create(
    model="sonnet",
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)

Streaming

stream = client.chat.completions.create(
    model="sonnet",
    messages=[{"role": "user", "content": "Tell me a story"}],
    stream=True
)
for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="")

CLI Client

The CLI client can be used for ad-hoc testing:

# Direct prompt
python -m claudebridge.client "What is Python?"

# Pipe from stdin
echo "Hello" | python -m claudebridge.client

# Use different model
python -m claudebridge.client --model opus "Explain decorators"

# Non-streaming mode
python -m claudebridge.client --no-stream "Quick answer"

# Multiple parallel requests
python -m claudebridge.client -n 3 "Hello"

BridgeClient Library

Use BridgeClient programmatically for testing or integration:

from claudebridge.client import BridgeClient

# Sync usage
with BridgeClient() as client:
    if client.health_check():
        models = client.list_models()
        response = client.complete_sync("Hello!", stream=False)
        print(response)

# Async usage
import asyncio

async def main():
    async with BridgeClient() as client:
        response = await client.complete("Hello!")
        print(response)

        # Or stream chunks
        async for chunk in client.stream("Tell me a story"):
            print(chunk, end="")

asyncio.run(main())

Testing

# Install test dependencies
uv pip install -e ".[test]"

# Run the test suite (requires the server to be running)
uv run pytest tests/test_client.py -v

Compatible Tools

  • Cursor — Use Claude through Cursor's OpenAI-compatible backend
  • Continue.dev — VS Code extension with OpenAI endpoint support
  • Open WebUI — Self-hosted ChatGPT-like interface
  • LangChain / LlamaIndex — Via OpenAI provider
  • Any OpenAI SDK client — Python, TypeScript, Go, etc.

Available Models

Model ID Description
opus Claude Opus (most capable)
sonnet Claude Sonnet (balanced)
haiku Claude Haiku (fastest)

Also accepts: claude-opus, claude-sonnet, claude-haiku, claude-3-sonnet, claude-3.5-sonnet, etc.
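
A sketch of how such aliases might collapse to the three short ids (the server's actual matching logic may differ):

```python
def normalize_model(model_id: str, default: str = "sonnet") -> str:
    """Map an incoming model id to a short id (illustrative, hypothetical helper)."""
    lowered = model_id.lower()
    for family in ("opus", "sonnet", "haiku"):
        if family in lowered:
            return family
    return default  # fall back when no family name matches
```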

API Endpoints

Endpoint Method Description
/api/v1/chat/completions POST Chat completions (OpenAI format)
/api/v1/models GET List available models
/health GET Health check
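
Before sending traffic, a client can probe /health to see whether the bridge is up. A small stdlib-only helper, assuming the default port from the Configuration section:

```python
import urllib.request
import urllib.error

def bridge_is_up(base_url: str = "http://localhost:8082", timeout: float = 5.0) -> bool:
    """Return True if GET /health answers with HTTP 200, False otherwise."""
    try:
        with urllib.request.urlopen(f"{base_url}/health", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False  # server not running, refused, or timed out
```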

Configuration

The bridge uses your existing Claude Code authentication:

claude login

Environment Variables

Variable Default Description
PORT 8082 Server port
POOL_SIZE 3 Number of workers (also settable via --workers/-w flag)
CLAUDE_TIMEOUT 120 Request timeout in seconds
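
As an illustration of the documented variables and defaults (not the server's actual startup code), configuration resolution amounts to:

```python
import os

def load_config(env=os.environ):
    """Resolve server settings from the environment, using the documented defaults."""
    return {
        "port": int(env.get("PORT", "8082")),
        "pool_size": int(env.get("POOL_SIZE", "3")),
        "claude_timeout": float(env.get("CLAUDE_TIMEOUT", "120")),
    }
```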

Client Environment Variables

Variable Default Description
BRIDGE_URL http://localhost:8082 API base URL
OPENROUTER_API_KEY - API key (if needed)
OPENROUTER_MODEL default Default model

Architecture

claudebridge/
├── server.py         # FastAPI app, endpoints, Claude SDK integration
├── pool.py           # Client pool for connection reuse
├── models.py         # Pydantic models for OpenAI request/response format
├── client.py         # BridgeClient library + CLI
└── session_logger.py # Request/response logging
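
The connection-reuse idea behind pool.py can be sketched with a pre-spawned pool (an illustrative toy, not the actual implementation, which manages Claude SDK clients asynchronously):

```python
import queue

class ClientPool:
    """Pre-spawn N clients so requests never pay startup latency (illustrative)."""

    def __init__(self, factory, size: int = 3):
        self._idle = queue.Queue()
        for _ in range(size):
            self._idle.put(factory())  # spawn clients up front

    def acquire(self):
        return self._idle.get()  # blocks until a client is free

    def release(self, client) -> None:
        self._idle.put(client)  # return the client for reuse
```

A request handler would acquire() a client, run the completion, then release() it, so the next request reuses the already-warm client instead of spawning a fresh one.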

Requirements

  • Python 3.10+
  • Active Claude Code subscription
  • Claude Code CLI authenticated (claude login)

Legal Disclaimer

⚠️ Use at Your Own Peril

This tool (claudebridge) creates a bridge between OpenAI-compatible clients and Claude using the Claude Code SDK. Whether this usage is permitted under Anthropic's Terms of Service is not clearly defined and is open to interpretation.

The Ambiguity

Anthropic's terms contain provisions that may affect this usage:

  • Commercial Terms: Prohibit building competing products, reverse engineering, or reselling services
  • Consumer Terms: Restrict automated/non-human access except via approved APIs
  • Both: Prohibit developing competing products or training competing AI models

This tool:

  • Uses the Claude Code SDK (not the official Anthropic API)
  • Enables programmatic access to Claude through your existing subscription
  • Could be interpreted as "automated access" or a "competing product"

Our Interpretation

We believe that for lightweight personal local usage, this should fall within acceptable use since:

  • It requires an active Claude subscription
  • It uses the official Claude Code SDK
  • It's for personal/local development purposes only
  • It doesn't compete with or replace Anthropic's services

However, this interpretation is disputable and may not align with Anthropic's view.

Guidelines for Conservative Use

Even if you conclude this usage is legitimate, act conservatively:

  • For yourself only — Do not share access, create multi-user services, or allow others to use your instance
  • Stay local — Run only on your personal machine, not on servers or cloud infrastructure
  • Minimal usage — Use sparingly and only for genuine personal development needs
  • No automation — Do not build automated pipelines, bots, or services on top of this tool
  • No redistribution — Do not package or distribute this as part of other products
  • Stay within boundaries — Respect rate limits and avoid any behavior that could be seen as abuse

Your Responsibility

By using this tool, you acknowledge that:

  1. You have read and understand Anthropic's Terms of Service
  2. You accept that this usage may violate those terms
  3. You use this tool entirely at your own peril
  4. You are solely responsible for any consequences of use
  5. We assume no liability for your use of this tool

We strongly encourage you to:

  • Review Anthropic's terms yourself
  • Make your own determination about permissibility
  • Err on the side of caution
  • Contact Anthropic if you need clarification
  • Discontinue use if you have any concerns

This project is provided as-is for educational purposes only.

License

MIT


Download files

Download the file for your platform.

Source Distribution

py_claudebridge-0.1.7.tar.gz (1.7 MB)

Uploaded Source

Built Distribution

py_claudebridge-0.1.7-py3-none-any.whl (24.5 kB)

Uploaded Python 3

File details

Details for the file py_claudebridge-0.1.7.tar.gz.

File metadata

  • Download URL: py_claudebridge-0.1.7.tar.gz
  • Size: 1.7 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for py_claudebridge-0.1.7.tar.gz
Algorithm Hash digest
SHA256 aad4831bd14eae060c279900e8fa28d2d4d673a0c80d67d5c5365da8236a41e1
MD5 4f980371883dcce6a133aa6ce3804db5
BLAKE2b-256 abcef52bdf1f53140b24a432c6f40f40769e29a253dd48ff5b6a43a909cbbdf9


Provenance

The following attestation bundles were made for py_claudebridge-0.1.7.tar.gz:

Publisher: release.yml on tsilva/claudebridge

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file py_claudebridge-0.1.7-py3-none-any.whl.

File hashes

Hashes for py_claudebridge-0.1.7-py3-none-any.whl
Algorithm Hash digest
SHA256 be94bf2e36770a0e168ee16061db2a14ffaa04a8cc6f0b6e6da710bbe1719f73
MD5 8311a403f9602c8dae07082196d958dd
BLAKE2b-256 4ee2c0cb1bf13b691917a2988f58943ebe53b25fce79a02b33a490f1de2b4677


Provenance

The following attestation bundles were made for py_claudebridge-0.1.7-py3-none-any.whl:

Publisher: release.yml on tsilva/claudebridge

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
