openai-local-bridge

Local OpenAI bridge CLI for OpenAI-compatible upstreams.
openai-local-bridge routes local api.openai.com requests to a third-party OpenAI-compatible endpoint, allowing tools such as Trae and AI Assistant to use GPT Codex through a non-OpenAI API.
Prerequisites
- Git
- Python
- uv: optional, but recommended. When available, the npm launcher prefers it so the latest CLI can be run directly.
- OpenSSL: required when running `olb enable` or `olb start` to generate local certificates. On Windows, install OpenSSL first and make sure the directory containing `openssl.exe` is in `PATH`.
Check your environment:
git --version
python --version
openssl version
Installation
If you want a standalone binary with the Python runtime bundled, download the platform archive from GitHub Releases. Those archives do not require Python or uv; only OpenSSL is still needed for olb enable / olb start.
Method 1: uv
uv tool install openai-local-bridge
Method 2: pip
python -m pip install --user openai-local-bridge
Method 3: npm
npm install -g openai-local-bridge
The npm package downloads the matching standalone binary for the current platform, so runtime use does not require Python or uv.
Method 4: curl / PowerShell
Linux / macOS:
curl -fsSL https://raw.githubusercontent.com/duanluan/openai-local-bridge/main/install.sh | bash
Windows PowerShell:
irm https://raw.githubusercontent.com/duanluan/openai-local-bridge/main/install.ps1 | iex
Method 5: standalone binary
Download the matching archive from GitHub Releases, then unpack and run olb directly:
- olb-linux-x86_64.tar.gz
- olb-macos-x86_64.tar.gz
- olb-macos-arm64.tar.gz
- olb-windows-x86_64.zip
Quick Start
The most direct way to use it is:
olb start
Run it in the background:
olb start --background
If the machine has not been configured yet, olb start first runs initialization, then continues with enablement and startup. In interactive mode, it asks for:
- Base URL
- API Key
- Reasoning effort
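These answers are written to the configuration file. As a purely hypothetical sketch (the field names and format below are invented for illustration; run olb config to see the real layout), the stored values might look like:

```json
{
  "base_url": "https://your-upstream.example.com/v1",
  "api_key": "sk-...",
  "reasoning_effort": "medium"
}
```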
If you only want to update the configuration, run:
olb init
To stop the takeover:
olb disable
To stop a running bridge process:
olb stop
Common Commands
Command overview:
- `olb`: runs initialization when no config exists; otherwise shows the current status
- `init`: initial setup or reconfiguration
- `config`: show the current configuration
- `config-path`: show the configuration file path
- `status`: show the current status
- `enable`: install certificates, update hosts, and manage NSS on supported platforms
- `disable`: remove the hosts takeover
- `start`: if not initialized, run setup first, then execute `enable` and start the bridge immediately
- `start --background`: start the bridge in the background and write logs to the config directory
- `stop`: stop the current bridge process, including one started in the background
Wrapper Script Entry Points
If you are running directly from the repository, you can also use:
Linux / macOS
./openai-local-bridge.sh <command>
Windows PowerShell
.\openai-local-bridge.ps1 <command>
Windows BAT
openai-local-bridge.bat <command>
All of these entry points forward to the same CLI.
If you install through npm, olb starts the bundled platform binary directly. Supported npm targets are Linux x64, macOS x64, macOS arm64, and Windows x64.
Release
The release workflow lives in .github/workflows/release-binaries.yml.
Before pushing a release tag:
- set matching versions in `pyproject.toml` and `package.json`
- configure GitHub secrets `PYPI_API_TOKEN` and `NPM_TOKEN`
- push a tag such as `v0.2.2`
When the workflow runs on that tag, it:
- builds standalone binaries for Linux x64, macOS x64, macOS arm64, and Windows x64
- uploads those binaries to GitHub Releases
- publishes the Python package to PyPI
- publishes the root npm package and all platform npm binary packages
Using It in a Client
Using Trae as an example, the recommended flow is split into two phases.
Phase 1: Add the model in the client first
1. Keep this project disabled:
   olb disable
2. Confirm that the machine can reach the official OpenAI service.
3. Add the model in the client, for example:
   - Provider: OpenAI
   - Model: Custom model
   - Model ID: gpt-5.4
   - API Key: your official OpenAI key
Phase 2: Enable the bridge for subsequent requests
olb start
Then choose the model you just added in the client.
FAQ
What does olb do by default?
- If no configuration file exists, it starts initialization.
- If configuration is already complete, it shows the current status.
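The no-argument dispatch can be pictured with a small sketch. Everything here is illustrative: `default_action` and the path handling are invented for this example, not taken from the project's code.

```python
from pathlib import Path

def default_action(config_path: Path) -> str:
    # Illustrative only: when no config file exists yet, `olb` initializes;
    # once configuration is complete, it reports status instead.
    return "status" if config_path.exists() else "init"

print(default_action(Path("/nonexistent/olb-config.json")))  # init
```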
What can I check with olb status?
These fields are usually the most important:
- `hosts`: whether takeover is active
- `root_ca`: whether the root certificate exists
- `nss`: NSS status
- `listener`: whether the local listener is running
- `listen_addr`: listening address
- `config`: configuration file location
Other software is affected too
That is expected with the current approach: the takeover happens at the system level for api.openai.com, so every process on the machine that resolves that hostname is redirected.
Restore normal behavior immediately:
olb disable
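To make the mechanism concrete, here is a minimal Python sketch of the idea behind the system-wide takeover. `takeover_active` is a hypothetical helper written for this README, not part of olb.

```python
# Hypothetical helper (not part of olb): an uncommented hosts line mapping
# api.openai.com to a loopback address redirects every client on the machine,
# which is why the takeover affects other software too.
def takeover_active(hosts_text: str, host: str = "api.openai.com") -> bool:
    for line in hosts_text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        parts = line.split()
        if len(parts) >= 2 and parts[0].startswith("127.") and host in parts[1:]:
            return True
    return False

print(takeover_active("127.0.0.1 api.openai.com"))    # True
print(takeover_active("# 127.0.0.1 api.openai.com"))  # False
```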
Model requests fail
Check these items first:
- Whether the Base URL is correct
- Whether the API Key is correct
- Whether the upstream service is OpenAI-compatible
- Whether the upstream model you configured actually exists
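One way to check the last two items is to probe the upstream's model list directly. The sketch below builds (but does not send) such a request with the standard library; the base URL and key are placeholders you would replace with your configured values.

```python
import urllib.request

def models_probe(base_url: str, api_key: str) -> urllib.request.Request:
    # Build a GET <base>/models request; any OpenAI-compatible upstream
    # should answer it with a JSON list of available models.
    url = base_url.rstrip("/") + "/models"
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {api_key}"}
    )

req = models_probe("https://example.com/v1", "sk-example")
print(req.full_url)  # https://example.com/v1/models
```

Sending the request with `urllib.request.urlopen(req)` should succeed and list a model matching the one you configured; an error here points at the upstream rather than the bridge.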
Failed to modify hosts or import certificates on Windows
This is usually a permissions issue. Run the command again from an elevated (administrator) terminal.
Windows says missing command: openssl
The current implementation requires OpenSSL to be installed locally. Install OpenSSL first, then confirm that this works in your terminal:
openssl version
Startup fails on Linux / macOS
If you use the default port 443, binding to it may require elevated privileges. Follow the prompt, or switch to an unprivileged port (1024 or above).
Configuration File
The CLI writes its configuration to a file under your user configuration directory.
View the path:
olb config-path
View the current configuration:
olb config
Security Notes
Before using this project, keep in mind:
- It installs a local certificate on your machine.
- It modifies the system hosts file.
- Run `olb disable` when you are not using it.