
Local OpenAI bridge CLI for OpenAI-compatible upstreams


openai-local-bridge

中文说明 (Chinese documentation)

openai-local-bridge intercepts requests to api.openai.com on the local machine and forwards them to a third-party OpenAI-compatible endpoint, allowing tools such as Trae and AI Assistant to use GPT Codex through a non-OpenAI API.
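In effect, the bridge answers connections addressed to api.openai.com locally and replays each request against the configured upstream base URL. The helper below is an illustrative sketch of that rewrite, not code from the package; the upstream URL is a placeholder:

```python
from urllib.parse import urlsplit, urlunsplit

def rewrite_to_upstream(url: str, upstream_base: str) -> str:
    """Swap the api.openai.com origin for the configured upstream base,
    keeping the request path and query string intact."""
    req = urlsplit(url)
    up = urlsplit(upstream_base)
    path = up.path.rstrip("/") + req.path
    return urlunsplit((up.scheme, up.netloc, path, req.query, ""))

print(rewrite_to_upstream(
    "https://api.openai.com/v1/chat/completions",
    "https://example-upstream.invalid/openai",
))
# https://example-upstream.invalid/openai/v1/chat/completions
```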

Prerequisites

  • OpenSSL: required when running olb enable or olb start to generate local certificates. On Windows, install OpenSSL first and make sure the directory containing openssl.exe is in PATH.

Runtime check:

openssl version
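The same check can be scripted. This helper is my own sketch of such a preflight test, not something olb ships:

```python
import shutil
import subprocess

def openssl_available() -> bool:
    """Return True if an `openssl` executable is on PATH and answers `version`."""
    exe = shutil.which("openssl")
    if exe is None:
        return False
    try:
        out = subprocess.run([exe, "version"],
                             capture_output=True, text=True, check=True)
    except (OSError, subprocess.CalledProcessError):
        return False
    return out.stdout.strip() != ""

print(openssl_available())
```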

Installation

If you want a standalone binary with the Python runtime bundled, download the platform archive from GitHub Releases. Those archives do not require Git, Python, uv, or npm; only OpenSSL is still needed for olb enable / olb start.

Method 1: uv

uv tool install openai-local-bridge

Method 2: pip

python -m pip install --user openai-local-bridge

Method 3: npm

npm install -g @duanluan/openai-local-bridge

The npm package downloads the matching standalone binary from GitHub Releases during installation, so runtime use does not require Python or uv.

Method 4: curl / PowerShell

Linux / macOS:

curl -fsSL https://raw.githubusercontent.com/duanluan/openai-local-bridge/main/install.sh | bash

Windows PowerShell:

irm https://raw.githubusercontent.com/duanluan/openai-local-bridge/main/install.ps1 | iex

Method 5: standalone binary

Download the matching archive from GitHub Releases, then unpack and run olb directly:

  • olb-linux-x86_64.tar.gz
  • olb-macos-x86_64.tar.gz
  • olb-macos-arm64.tar.gz
  • olb-windows-x86_64.zip

Quick Start

The most direct way to use it is:

olb start

By default, start runs in the background. For foreground output while debugging:

olb start --debug
olb start -d

Background logs are written to bridge.log under the config directory and rotate automatically at 1 MiB with 3 backup files.

If the machine has not been configured yet, olb start first runs initialization, then continues with enablement and startup. In interactive mode, it asks for:

  • Base URL
  • API Key
  • Reasoning effort

If you only want to update the active account configuration, run:

olb init

To add another upstream account:

olb account add work

To use another active account for olb start:

olb account use work

To stop the takeover:

olb disable

To stop a running bridge process:

olb stop

To restart the bridge and keep the current run mode:

olb restart

To inspect the local setup without making changes:

olb doctor

To follow the log, showing the latest 10 lines first:

olb log

Common Commands

Command overview:

  • olb: runs initialization when no config exists; otherwise shows the current status
  • init: initial setup or reconfiguration of the active account
  • config: show the active account configuration
  • config-path: show the configuration file path
  • a: shorthand for account
  • account list / account ls: list saved accounts
  • account add <name>: add a new account
  • account edit [name]: edit the active account or the named account
  • account delete <name>: delete an account
  • account use <name>: use the selected account as active
  • status: show the current status
  • enable: install certificates, update hosts, and manage NSS on supported platforms
  • disable: remove the hosts takeover
  • start: runs setup first if the machine is not initialized, then performs enable and starts the bridge in the background
  • start --debug / start -d: run the bridge in the foreground for debugging
  • restart: restart the bridge; when no flag is passed it keeps the current run mode
  • restart --debug / restart -d: restart the bridge in the foreground
  • reload: legacy alias for restart
  • doctor: inspect local bridge setup without making changes
  • -v / -V / --version: print the installed version
  • log: follow the log file and show the latest 10 lines first
  • stop: stop the current bridge process, including one started in the background

Wrapper Script Entry Points

If you are running directly from the repository, you can also use:

Linux / macOS

./openai-local-bridge.sh <command>

Windows PowerShell

.\openai-local-bridge.ps1 <command>

Windows BAT

openai-local-bridge.bat <command>

All of these entry points forward to the same CLI.

If you install through npm, olb starts the bundled platform binary directly. Supported npm targets are Linux x64, macOS x64, macOS arm64, and Windows x64.

Using It in a Client

Using Trae as an example, the recommended flow is split into two phases.

Phase 1: Add the model in the client first

  1. Keep this project disabled:

    olb disable
    
  2. Confirm that the machine can reach the official OpenAI service.

  3. Add the model in the client, for example:

    • Provider: OpenAI
    • Model: Custom model
    • Model ID: gpt-5.4
    • API Key: your official OpenAI key

Phase 2: Enable the bridge for subsequent requests

olb start

Then choose the model you just added in the client.
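Whether the takeover is actually in effect can be verified by checking what api.openai.com resolves to. This check is my own sketch and assumes the bridge listens on a loopback address:

```python
import ipaddress
import socket

def bridge_active(host: str = "api.openai.com") -> bool:
    """True if the host resolves to a loopback address, meaning the hosts
    takeover is installed and requests will hit the local bridge."""
    try:
        addr = socket.gethostbyname(host)
    except OSError:
        return False
    return ipaddress.ip_address(addr).is_loopback

print(bridge_active())
```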

FAQ

What does olb do by default?

  • If no configuration file exists, it starts initialization.
  • If configuration is already complete, it shows the current status.

What can I check with olb status?

These fields are usually the most important:

  • hosts: whether takeover is active
  • root_ca: whether the root certificate exists
  • nss: NSS status
  • listener: whether the local listener is running
  • listen_addr: listening address
  • config: configuration file location

Other software is affected too

That is expected with the current approach, because the takeover happens at the system level for api.openai.com.
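Concretely, the takeover is a hosts-file entry pointing the API host at the local listener. The exact line olb writes is an assumption on my part, but a typical entry looks like:

```
127.0.0.1    api.openai.com
```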

Restore normal behavior immediately:

olb disable

Model requests fail

Check these items first:

  • Whether Base URL is correct
  • Whether API Key is correct
  • Whether the upstream service is OpenAI-compatible
  • Whether the upstream model you configured actually exists

Failed to modify hosts or import certificates on Windows

This is usually a permissions issue. Run the command again from an elevated (administrator) terminal.

Windows says missing command: openssl

The current implementation requires OpenSSL to be installed locally. Install OpenSSL first, then confirm that this works in your terminal:

openssl version

Startup fails on Linux / macOS

If you use the default port 443, the system may require elevated privileges. Follow the prompt, or switch to a higher port.
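The privilege boundary is easy to demonstrate: binding a TCP port below 1024 as an unprivileged user fails on these platforms, while high or ephemeral ports work. The probe below is illustrative, not part of olb:

```python
import socket

def can_bind(port: int) -> bool:
    """Try to bind a TCP socket on the loopback interface at the given port."""
    try:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.bind(("127.0.0.1", port))
        return True
    except OSError:
        return False

print(can_bind(0))    # ephemeral port: works for any user
print(can_bind(443))  # usually False without elevated privileges
```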

Configuration File

The CLI writes its configuration to a file under your user configuration directory. The file stores the active account plus all saved upstream accounts.

View the path:

olb config-path

View the current configuration:

olb config
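The exact directory is platform-dependent, and olb config-path is authoritative; the fallback logic below is my guess at the usual per-user conventions, not the package's code:

```python
import os
import sys

def guess_config_dir(app: str = "openai-local-bridge") -> str:
    """Guess a per-user config directory by platform convention."""
    if sys.platform == "win32":
        base = os.environ.get("APPDATA", os.path.expanduser("~"))
    elif sys.platform == "darwin":
        base = os.path.expanduser("~/Library/Application Support")
    else:
        base = os.environ.get("XDG_CONFIG_HOME", os.path.expanduser("~/.config"))
    return os.path.join(base, app)

print(guess_config_dir())
```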

Security Notes

Before using this project, keep in mind:

  • It installs a local certificate on your machine.
  • It modifies the system hosts file.
  • Run olb disable when you are not using it.
