openai-local-bridge

Local OpenAI bridge CLI for OpenAI-compatible upstreams.
openai-local-bridge routes local api.openai.com requests to a third-party OpenAI-compatible endpoint, allowing tools such as Trae and AI Assistant to use GPT Codex through a non-OpenAI API.
Prerequisites
OpenSSL: required when running `olb enable` or `olb start` to generate local certificates. On Windows, install OpenSSL first and make sure the directory containing `openssl.exe` is in `PATH`.
Runtime check:
openssl version
Installation
If you want a standalone binary with the Python runtime bundled, download the platform archive from GitHub Releases. Those archives do not require Git, Python, uv, or npm; only OpenSSL is still needed for `olb enable` / `olb start`.
Method 1: uv
uv tool install openai-local-bridge
Method 2: pip
python -m pip install --user openai-local-bridge
Method 3: npm
npm install -g @duanluan/openai-local-bridge
The npm package downloads the matching standalone binary from GitHub Releases during installation, so runtime use does not require Python or uv.
Method 4: curl / PowerShell
Linux / macOS:
curl -fsSL https://raw.githubusercontent.com/duanluan/openai-local-bridge/main/install.sh | bash
Windows PowerShell:
irm https://raw.githubusercontent.com/duanluan/openai-local-bridge/main/install.ps1 | iex
Method 5: standalone binary
Download the matching archive from GitHub Releases, then unpack and run olb directly:
- olb-linux-x86_64.tar.gz
- olb-macos-x86_64.tar.gz
- olb-macos-arm64.tar.gz
- olb-windows-x86_64.zip
Quick Start
The most direct way to use it is:
olb start
`olb start` runs in the background by default. If you want foreground output for debugging:
olb start --debug
olb start -d
Background logs are written to bridge.log under the config directory and rotate automatically at 1 MiB with 3 backup files.
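The described rotation (1 MiB threshold, 3 backups) matches what Python's standard `RotatingFileHandler` provides. A sketch with the same parameters (the helper name is hypothetical, not olb's internal API):

```python
import logging
from logging.handlers import RotatingFileHandler
from pathlib import Path

def make_bridge_logger(log_dir: str, max_bytes: int = 1_048_576,
                       backups: int = 3) -> logging.Logger:
    """Log to bridge.log under log_dir, rotating at max_bytes, keeping `backups` old files."""
    log = logging.getLogger("bridge-demo")
    log.setLevel(logging.INFO)
    handler = RotatingFileHandler(Path(log_dir) / "bridge.log",
                                  maxBytes=max_bytes, backupCount=backups)
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
    log.addHandler(handler)
    return log
```

When the threshold is reached, `bridge.log` is renamed to `bridge.log.1` (and so on up to `.3`) and a fresh file is started.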
If the machine has not been configured yet, olb start first runs initialization, then continues with enablement and startup. In interactive mode, it asks for:
- Base URL
- API Key
- Reasoning effort
If you only want to update the active account configuration, run:
olb init
To add another upstream account:
olb account add work
To switch the active account used by `olb start`:
olb account use work
To stop the takeover:
olb disable
To stop a running bridge process:
olb stop
To restart the bridge and keep the current run mode:
olb restart
To inspect the local setup without making changes:
olb doctor
To follow the log, showing the latest 10 lines first:
olb log
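The "latest 10 lines first" part of `olb log` can be sketched like this (a simplified stand-in without the follow loop; `tail` is an illustrative helper, not olb's code):

```python
from collections import deque

def tail(path: str, n: int = 10) -> list[str]:
    """Return the last n lines of a text file, streaming instead of loading it whole."""
    with open(path, encoding="utf-8", errors="replace") as f:
        return list(deque(f, maxlen=n))
```

`collections.deque` with `maxlen` keeps only the most recent lines as the file is iterated, so memory use stays bounded even for large logs.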
Common Commands
Command overview:
- `olb`: runs initialization when no config exists; otherwise shows the current status
- `init`: initial setup or reconfiguration of the active account
- `config`: show the active account configuration
- `config-path`: show the configuration file path
- `a`: shorthand for `account`
- `account list` / `account ls`: list saved accounts
- `account add <name>`: add a new account
- `account edit [name]`: edit the active account or the named account
- `account delete <name>`: delete an account
- `account use <name>`: use the selected account as active
- `status`: show the current status
- `enable`: install certificates, update hosts, and manage NSS on supported platforms
- `disable`: remove the hosts takeover
- `start`: if not initialized, run setup first, then execute `enable` and start the bridge in the background
- `start --debug` / `start -d`: run the bridge in the foreground for debugging
- `restart`: restart the bridge; when no flag is passed it keeps the current run mode
- `restart --debug` / `restart -d`: restart the bridge in the foreground
- `reload`: legacy alias for `restart`
- `doctor`: inspect local bridge setup without making changes
- `-v` / `-V` / `--version`: print the installed version
- `log`: follow the log file and show the latest 10 lines first
- `stop`: stop the current bridge process, including one started in the background
Wrapper Script Entry Points
If you are running directly from the repository, you can also use:
Linux / macOS
./openai-local-bridge.sh <command>
Windows PowerShell
.\openai-local-bridge.ps1 <command>
Windows BAT
openai-local-bridge.bat <command>
All of these entry points forward to the same CLI.
If you install through npm, olb starts the bundled platform binary directly. Supported npm targets are Linux x64, macOS x64, macOS arm64, and Windows x64.
Using It in a Client
Using Trae as an example, the recommended flow is split into two phases.
Phase 1: Add the model in the client first
1. Keep this project disabled: `olb disable`
2. Confirm that the machine can reach the official OpenAI service.
3. Add the model in the client, for example:
   - Provider: OpenAI
   - Model: Custom model
   - Model ID: gpt-5.4
   - API Key: your official OpenAI key
Phase 2: Enable the bridge for subsequent requests
olb start
Then choose the model you just added in the client.
FAQ
What does olb do by default?
- If no configuration file exists, it starts initialization.
- If configuration is already complete, it shows the current status.
What can I check with olb status?
These fields are usually the most important:
- `hosts`: whether takeover is active
- `root_ca`: whether the root certificate exists
- `nss`: NSS status
- `listener`: whether the local listener is running
- `listen_addr`: listening address
- `config`: configuration file location
Other software is affected too
That is expected with the current approach, because the takeover happens at the system level for api.openai.com.
Restore normal behavior immediately:
olb disable
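Concretely, the takeover is a hosts-file entry pointing api.openai.com at the loopback address, which is why every program on the machine is redirected. A sketch of how such an entry can be detected (the parsing helper and the sample comment text are illustrative):

```python
def hosts_overrides(hosts_text: str, hostname: str = "api.openai.com") -> list[str]:
    """Return the IPs that a hosts file maps `hostname` to."""
    ips = []
    for line in hosts_text.splitlines():
        line = line.split("#", 1)[0].strip()   # strip comments and whitespace
        fields = line.split()
        if len(fields) >= 2 and hostname in fields[1:]:
            ips.append(fields[0])
    return ips

sample = "127.0.0.1 localhost\n127.0.0.1 api.openai.com  # example takeover entry\n"
print(hosts_overrides(sample))  # ['127.0.0.1']
```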
Model requests fail
Check these items first:
- Whether `Base URL` is correct
- Whether `API Key` is correct
- Whether the upstream service is OpenAI-compatible
- Whether the upstream model you configured actually exists
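The Base URL check can be partly automated before any network request. A local sanity-check sketch (`check_base_url` is a hypothetical helper; which path suffix an upstream expects varies by provider):

```python
from urllib.parse import urlparse

def check_base_url(url: str) -> list[str]:
    """Return a list of likely problems with an upstream Base URL."""
    problems = []
    parts = urlparse(url)
    if parts.scheme not in ("http", "https"):
        problems.append("scheme should be http or https")
    if not parts.netloc:
        problems.append("missing host")
    if url.endswith("/"):
        problems.append("trailing slash often breaks path joining")
    return problems

print(check_base_url("https://api.example.com/v1"))  # []
```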
Failed to modify hosts or import certificates on Windows
This is usually a permission issue. Run the command again in a terminal with sufficient privileges.
Windows says missing command: openssl
The current implementation requires OpenSSL to be installed locally. Install OpenSSL first, then confirm that this works in your terminal:
openssl version
Startup fails on Linux / macOS
If you use the default port 443, the system may require elevated privileges. Follow the prompt, or switch to a higher port.
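The privilege requirement comes from the OS: on Linux and macOS, binding ports below 1024 needs elevated rights (or, on Linux, a capability such as CAP_NET_BIND_SERVICE). A quick way to probe whether the current user can take a port (`can_bind` is an illustrative helper):

```python
import socket

def can_bind(port: int, host: str = "127.0.0.1") -> bool:
    """Try to bind a TCP socket; True if the current user may use this port."""
    try:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.bind((host, port))
        return True
    except OSError:
        return False

# can_bind(443) is typically False for unprivileged users on Linux/macOS;
# port 0 asks the OS for any free ephemeral port and should succeed.
```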
Configuration File
The CLI writes its configuration to a file under your user configuration directory. The file stores the active account plus all saved upstream accounts.
View the path:
olb config-path
View the current configuration:
olb config
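The source does not document the exact schema. As an assumption for illustration only, a file holding an active account plus named accounts could look like the following (all field names are hypothetical; run `olb config` to see the real layout):

```python
import json

# Hypothetical layout: actual olb field names may differ.
config = {
    "active": "work",
    "accounts": {
        "default": {"base_url": "https://api.example.com/v1",
                    "api_key": "sk-...", "reasoning_effort": "medium"},
        "work":    {"base_url": "https://api.example.org/v1",
                    "api_key": "sk-...", "reasoning_effort": "high"},
    },
}
active = config["accounts"][config["active"]]
print(active["base_url"])  # https://api.example.org/v1
```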
Security Notes
Before using this project, keep in mind:
- It installs a local certificate on your machine.
- It modifies the system `hosts` file.
- Run `olb disable` when you are not using it.