
agentbridge

🌉 Bridge OpenAI tools to Claude Code SDK, Codex CLI, or OpenRouter — use your subscriptions anywhere 🔌

agentbridge is a local OpenAI-compatible API server with provider adapters for Claude Code SDK, Codex CLI, and OpenRouter. It lets tools that already speak the OpenAI Chat Completions API use your active local logins or OpenRouter API key through http://localhost:8082/api/v1.

It supports non-streaming and streaming chat completions, multimodal image/PDF inputs for Claude, image inputs for Codex, namespaced provider model IDs, a small live dashboard, and JSON session logs for debugging.
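Streaming responses follow the standard OpenAI server-sent-events shape (`data: {...}` chunk lines terminated by `data: [DONE]`). As a minimal illustration of what a client consumes, here is a sketch of reassembling streamed text from such lines (this parser is illustrative, not part of agentbridge):

```python
import json

def collect_stream(sse_lines):
    """Accumulate assistant text from OpenAI-style SSE chunk lines."""
    text = []
    for line in sse_lines:
        if not line.startswith("data: "):
            continue  # skip blanks and keep-alive comments
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0].get("delta", {})
        text.append(delta.get("content") or "")
    return "".join(text)

# Chunk lines in the shape a Chat Completions stream uses:
stream_lines = [
    'data: {"choices": [{"delta": {"role": "assistant"}}]}',
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo!"}}]}',
    "data: [DONE]",
]
print(collect_stream(stream_lines))  # Hello!
```

In practice the openai Python client handles this parsing for you when you pass `stream=True`.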

Legal notice: agentbridge can use Claude Code SDK and Codex CLI access through your local subscriptions, and can also forward requests to OpenRouter when configured. Whether this is allowed under the relevant service terms is your responsibility to evaluate. Use it conservatively and at your own risk.

Install

uv tool install agentbridge-py
claude login
codex login
agentbridge

Open http://localhost:8082/dashboard, use the chat tester at http://localhost:8082/dashboard/chat, or point an OpenAI-compatible client at http://localhost:8082/api/v1.
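`GET /api/v1/models` lists the available model IDs. Assuming the response uses the standard OpenAI list shape (`{"object": "list", "data": [{"id": ...}, ...]}`), you can group them by provider namespace with a small helper (hypothetical helper; the sample data below is illustrative, the real body comes from the endpoint):

```python
def models_by_provider(models_response):
    """Group model IDs from an OpenAI-style models list by namespace prefix."""
    grouped = {}
    for entry in models_response.get("data", []):
        provider = entry["id"].split("/", 1)[0]
        grouped.setdefault(provider, []).append(entry["id"])
    return grouped

# Illustrative response body:
models_sample = {
    "object": "list",
    "data": [
        {"id": "claudecode/sonnet"},
        {"id": "codex/gpt-5.5"},
        {"id": "openrouter/anthropic/claude-sonnet-4"},
    ],
}
print(models_by_provider(models_sample)["codex"])  # ['codex/gpt-5.5']
```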

Install from source when working on the repo:

git clone https://github.com/tsilva/agentbridge.git
cd agentbridge
uv pip install -e .
agentbridge

Usage

# Claude Code (uses your claude login)
curl http://localhost:8082/api/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "claudecode/sonnet", "messages": [{"role": "user", "content": "Hello!"}]}'

# Codex (uses your codex login; reasoning_effort is optional)
curl http://localhost:8082/api/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "codex/gpt-5.5", "reasoning_effort": "high", "messages": [{"role": "user", "content": "Hello!"}]}'

# OpenRouter (start the server with your API key)
OPENROUTER_API_KEY=sk-or-... agentbridge
curl http://localhost:8082/api/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "openrouter/anthropic/claude-sonnet-4", "messages": [{"role": "user", "content": "Hello!"}]}'
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8082/api/v1",
    api_key="not-needed",
)

response = client.chat.completions.create(
    model="claudecode/anthropic/claude-sonnet-4",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)

response = client.chat.completions.create(
    model="codex/gpt-5.5",
    reasoning_effort="high",
    messages=[{"role": "user", "content": "Hello from Codex!"}],
)
print(response.choices[0].message.content)

response = client.chat.completions.create(
    model="openrouter/anthropic/claude-sonnet-4",
    messages=[{"role": "user", "content": "Hello from OpenRouter!"}],
)
print(response.choices[0].message.content)
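The examples above are text-only. Image input (supported for the Claude and Codex adapters) uses the OpenAI content-parts format; a hedged sketch of building such a message with an inline base64 data URL follows (whether agentbridge accepts every `image_url` variant is not verified here):

```python
import base64

def image_message(prompt, image_bytes, mime="image/png"):
    """Build an OpenAI-style multimodal user message with an inline base64 image."""
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": prompt},
            {"type": "image_url", "image_url": {"url": f"data:{mime};base64,{b64}"}},
        ],
    }

# Usage sketch (image_bytes would be the contents of a real image file):
msg = image_message("Describe this image.", b"\x89PNG...")
# client.chat.completions.create(model="claudecode/sonnet", messages=[msg])
```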

Commands

agentbridge                         # start on http://127.0.0.1:8082
agentbridge --port 8083             # choose another port
agentbridge -w 3                    # run with three pooled workers
agentbridge --version               # print package and git version
uv tool install . --force --no-cache # reinstall local source build
uv run --extra test pytest -m unit   # run unit tests
uv run --extra test pytest           # run full tests; integration tests expect a server

Publishing

Publishing to PyPI runs through the Release GitHub Actions workflow and targets the agentbridge-py project. Configure PyPI Trusted Publishing for:

  • Owner: tsilva
  • Repository: agentbridge
  • Workflow: release.yml
  • Environment: pypi

No PyPI API token is required. The workflow builds the package, verifies that the built distribution's metadata name is agentbridge-py, creates the GitHub release, and then publishes dist/ to PyPI.

Notes

  • Requires Python 3.12+ and at least one authenticated backend: claude login for Claude Code models, codex login for Codex models, or OPENROUTER_API_KEY for OpenRouter models.
  • Endpoints: POST /api/v1/chat/completions, GET /api/v1/models, GET /health, /dashboard, and /dashboard/chat.
  • Model IDs must be prefixed with an AgentBridge provider namespace: claudecode/<model>, codex/<model>, or openrouter/<provider>/<model>.
  • Claude Code model inputs are claudecode/opus, claudecode/sonnet, claudecode/haiku, or namespaced slugs containing those names, such as claudecode/anthropic/claude-sonnet-4.
  • Codex model inputs are codex/<model>. The requested model is passed directly to Codex CLI; codex/gpt-5.5 defaults to reasoning_effort="high" unless the request sets another effort.
  • OpenRouter model inputs are openrouter/<provider>/<model>, for example openrouter/anthropic/claude-sonnet-4. The upstream model ID after openrouter/ is passed to OpenRouter.
  • PORT, POOL_SIZE, CLAUDE_TIMEOUT, CODEX_TIMEOUT, OPENROUTER_TIMEOUT, OPENROUTER_API_KEY, LOG_DIR, and MAX_LOG_FILES control local runtime behavior.
  • Each Claude request gets a fresh or pre-warmed Claude SDK client and the client is destroyed after use. Each Codex request runs an ephemeral codex exec process in an isolated temporary directory.
  • Claude Code tools are disabled for SDK sessions. Codex runs with read-only sandboxing, no approvals, ephemeral sessions, and ignored project rules. OpenAI-style function calling is emulated by prompting for JSON tool-call output.
  • Session logs are written as JSON under logs/sessions by default; base64 image and PDF attachments are saved beside their request logs.
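The routing rules above can be sketched as a small parser (a hypothetical helper, not agentbridge's actual code): split off the namespace prefix, and for OpenRouter keep the remainder as the upstream `provider/model` ID.

```python
def parse_model_id(model_id):
    """Split a namespaced model ID into (provider, upstream model)."""
    provider, _, rest = model_id.partition("/")
    if provider not in ("claudecode", "codex", "openrouter") or not rest:
        raise ValueError(
            f"expected a claudecode/, codex/, or openrouter/ prefix: {model_id!r}"
        )
    return provider, rest

print(parse_model_id("claudecode/sonnet"))
# ('claudecode', 'sonnet')
print(parse_model_id("openrouter/anthropic/claude-sonnet-4"))
# ('openrouter', 'anthropic/claude-sonnet-4')
```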

Architecture

[agentbridge architecture diagram]

License

MIT
