
codex-lb

Load balancer for ChatGPT accounts. Pool multiple accounts, track usage, manage API keys, view everything in a dashboard.

Screenshots: Dashboard, Accounts, Settings, and Login, in light and dark themes.

Features

- Account Pooling: load balance across multiple ChatGPT accounts
- Usage Tracking: per-account tokens, cost, and 28-day trends
- API Keys: per-key rate limits by token, cost, window, and model
- Dashboard Auth: password plus optional TOTP
- OpenAI-compatible: Codex CLI, OpenCode, any OpenAI client
- Auto Model Sync: available models fetched from upstream

Quick Start

# Docker (recommended)
docker volume create codex-lb-data
docker run -d --name codex-lb \
  -p 2455:2455 -p 1455:1455 \
  -v codex-lb-data:/var/lib/codex-lb \
  ghcr.io/soju06/codex-lb:latest

# or uvx
uvx codex-lb

Open localhost:2455 → Add account → Done.
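To check that the proxy is serving requests, query the model list (a quick sanity check, assuming codex-lb exposes the standard OpenAI-compatible /v1/models route and API key auth is still disabled):

# sanity check: assumes the standard OpenAI-compatible /v1/models route
curl http://127.0.0.1:2455/v1/models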

Client Setup

Point any OpenAI-compatible client at codex-lb. If API key auth is enabled, pass a key from the dashboard as a Bearer token.

Client              Endpoint                                   Config
Codex CLI           http://127.0.0.1:2455/backend-api/codex    ~/.codex/config.toml
OpenCode            http://127.0.0.1:2455/v1                   ~/.config/opencode/opencode.json
OpenClaw            http://127.0.0.1:2455/v1                   ~/.openclaw/openclaw.json
OpenAI Python SDK   http://127.0.0.1:2455/v1                   see the example below

Codex CLI / IDE Extension

~/.codex/config.toml:

model = "gpt-5.3-codex"
model_reasoning_effort = "xhigh"
model_provider = "codex-lb"

[model_providers.codex-lb]
name = "OpenAI"  # MUST be "OpenAI" - enables /compact endpoint
base_url = "http://127.0.0.1:2455/backend-api/codex"
wire_api = "responses"
requires_openai_auth = true

With API key auth:

model = "gpt-5.3-codex"
model_reasoning_effort = "xhigh"
model_provider = "codex-lb"

[model_providers.codex-lb]
name = "OpenAI"  # MUST be "OpenAI" - enables /compact endpoint
base_url = "http://127.0.0.1:2455/backend-api/codex"
wire_api = "responses"
env_key = "CODEX_LB_API_KEY"

export CODEX_LB_API_KEY="sk-clb-..."   # key from dashboard
codex
OpenCode

~/.config/opencode/opencode.json:

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "codex-lb": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "codex-lb",
      "options": {
        "baseURL": "http://127.0.0.1:2455/v1"
      },
      "models": {
        "gpt-5.3-codex": { "name": "GPT-5.3 Codex", "reasoning": true, "interleaved": { "field": "reasoning_details" } }
      }
    }
  },
  "model": "codex-lb/gpt-5.3-codex"
}

This keeps OpenCode's default providers/connections available and adds codex-lb as an extra selectable provider.

If you use enabled_providers, include every provider you want to keep plus codex-lb; otherwise non-listed providers are hidden.
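For example, an enabled_providers entry that keeps an existing provider alongside codex-lb might look like this (a sketch; "anthropic" is a placeholder for whichever providers you already use):

"enabled_providers": ["anthropic", "codex-lb"]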

With API key auth:

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "codex-lb": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "codex-lb",
      "options": {
        "baseURL": "http://127.0.0.1:2455/v1",
        "apiKey": "{env:CODEX_LB_API_KEY}"   // reads from env var
      },
      "models": {
        "gpt-5.3-codex": { "name": "GPT-5.3 Codex", "reasoning": true, "interleaved": { "field": "reasoning_details" } }
      }
    }
  },
  "model": "codex-lb/gpt-5.3-codex"
}

export CODEX_LB_API_KEY="sk-clb-..."   # key from dashboard
opencode
OpenClaw

~/.openclaw/openclaw.json:

{
  "agents": {
    "defaults": {
      "model": { "primary": "codex-lb/gpt-5.3-codex" }
    }
  },
  "models": {
    "mode": "merge",
    "providers": {
      "codex-lb": {
        "baseUrl": "http://127.0.0.1:2455/v1",
        "apiKey": "${CODEX_LB_API_KEY}",   // or "dummy" if API key auth is disabled
        "api": "openai-completions",
        "models": [
          { "id": "gpt-5.3-codex", "name": "GPT-5.3 Codex" },
          { "id": "gpt-5.3-codex-spark", "name": "GPT-5.3 Codex Spark" }
        ]
      }
    }
  }
}

Set the env var or replace ${CODEX_LB_API_KEY} with a key from the dashboard. If API key auth is disabled, any value works.

OpenAI Python SDK

from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:2455/v1",
    api_key="sk-clb-...",  # from dashboard, or any string if auth is disabled
)

response = client.chat.completions.create(
    model="gpt-5.3-codex",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
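Streaming should work through the same SDK interface, assuming codex-lb forwards streamed chunks unchanged (a minimal sketch):

stream = client.chat.completions.create(
    model="gpt-5.3-codex",
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True,
)
for chunk in stream:
    # each chunk carries an incremental delta; print tokens as they arrive
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)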

API Key Authentication

API key auth is disabled by default — the proxy is open to any client. Enable it in Settings → API Key Auth on the dashboard.

When enabled, clients must pass a valid API key as a Bearer token:

Authorization: Bearer sk-clb-...
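For example, with curl against the OpenAI-compatible chat completions route:

curl http://127.0.0.1:2455/v1/chat/completions \
  -H "Authorization: Bearer sk-clb-..." \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-5.3-codex", "messages": [{"role": "user", "content": "Hello!"}]}'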

Creating keys: Dashboard → API Keys → Create. The full key is shown only once at creation. Keys support optional expiration, model restrictions, and rate limits (tokens / cost per day / week / month).

Configuration

Configure codex-lb with environment variables using the CODEX_LB_ prefix, or with a .env.local file; see .env.example for the available options. Dashboard auth is configured in Settings.
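For example, a .env.local could pin settings at startup (the variable name below is hypothetical; the real names are listed in .env.example):

# .env.local (illustrative only; see .env.example for actual variable names)
CODEX_LB_SOME_SETTING=value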

Data

Environment Path
Local / uvx ~/.codex-lb/
Docker /var/lib/codex-lb/

Backup this directory to preserve your data.
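For the Docker setup, one way to snapshot the named volume is a throwaway container that tars its contents (a sketch; adjust the output path to taste):

# archive the codex-lb-data volume into the current directory
docker run --rm \
  -v codex-lb-data:/var/lib/codex-lb:ro \
  -v "$(pwd)":/backup \
  alpine tar czf /backup/codex-lb-backup.tar.gz -C /var/lib/codex-lb .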

Development

# Docker
docker compose watch

# Local
uv sync && cd frontend && bun install && cd ..
uv run fastapi run app/main.py --reload        # backend :2455
cd frontend && bun run dev                     # frontend :5173

Contributors ✨

Thanks goes to these wonderful people (emoji key):

Soju06 (💻 ⚠️ 🚧 🚇)
Jonas Kamsker (💻 🐛 🚧)
Quack (💻 🐛 🚧 🎨)
Jill Kok, San Mou (💻 ⚠️ 🚧 🐛)
PARK CHANYOUNG (📖)
Choi138 (💻 🐛 ⚠️)
LYA⚚CAP⚚OCEAN (💻 ⚠️)
Eugene Korekin (💻 🐛 ⚠️)
jordan (💻 🐛 ⚠️)

This project follows the all-contributors specification. Contributions of any kind welcome!

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

codex_lb-1.0.4.tar.gz (2.5 MB)


Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

codex_lb-1.0.4-py3-none-any.whl (619.3 kB)


File details

Details for the file codex_lb-1.0.4.tar.gz.

File metadata

  • Download URL: codex_lb-1.0.4.tar.gz
  • Size: 2.5 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for codex_lb-1.0.4.tar.gz
Algorithm Hash digest
SHA256 acaef83835b86031d09a565efdf6313e1636e56795ba525e48e559bcf5a1d3c2
MD5 13c44ffd461312001a7be0c84d8602d1
BLAKE2b-256 b14cde39af9c89298d9f557e6215d3a77e34765bbb9415216103dda867d5ccc1

See more details on using hashes here.

Provenance

The following attestation bundles were made for codex_lb-1.0.4.tar.gz:

Publisher: release.yml on Soju06/codex-lb

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file codex_lb-1.0.4-py3-none-any.whl.

File metadata

  • Download URL: codex_lb-1.0.4-py3-none-any.whl
  • Size: 619.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for codex_lb-1.0.4-py3-none-any.whl
Algorithm Hash digest
SHA256 013946c607e37131791e4d577009888341906f9b281e226a2c59518f07b2ccaa
MD5 c49ee05332a125110a1432ce0b950462
BLAKE2b-256 19d093e346da8648654e0ab3665b901579a097eae19bf006fa286da5d2671e61

See more details on using hashes here.

Provenance

The following attestation bundles were made for codex_lb-1.0.4-py3-none-any.whl:

Publisher: release.yml on Soju06/codex-lb

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
