
openai-local-bridge

Local OpenAI bridge CLI for OpenAI-compatible upstreams

Chinese documentation (中文说明)

openai-local-bridge routes local api.openai.com requests to a third-party OpenAI-compatible endpoint, allowing tools such as Trae and AI Assistant to use GPT Codex through a non-OpenAI API.

Prerequisites

  • Git
  • Python
  • uv: optional but recommended. When available, the npm launcher prefers it so the latest CLI can run directly.
  • OpenSSL: required when running olb enable or olb start to generate local certificates. On Windows, install OpenSSL first and make sure the directory containing openssl.exe is in PATH.

Check your environment:

git --version
python --version
openssl version
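The three checks above can also be run in one pass. This is a small helper sketch, not part of the olb CLI; it only reports which required tools are on PATH:

```shell
# Report which of the required tools are on PATH (helper sketch, not part of olb).
for tool in git python openssl; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```

On systems where the interpreter is installed as python3 rather than python, adjust the tool name accordingly.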

Installation

If you want a standalone binary with the Python runtime bundled, download the platform archive from GitHub Releases. Those archives do not require Python or uv; only OpenSSL is still needed for olb enable / olb start.

Method 1: uv

uv tool install openai-local-bridge

Method 2: pip

python -m pip install --user openai-local-bridge

Method 3: npm

npm install -g openai-local-bridge

The npm package downloads the matching standalone binary from GitHub Releases during installation, so runtime use does not require Python or uv.

Method 4: curl / PowerShell

Linux / macOS:

curl -fsSL https://raw.githubusercontent.com/duanluan/openai-local-bridge/main/install.sh | bash

Windows PowerShell:

irm https://raw.githubusercontent.com/duanluan/openai-local-bridge/main/install.ps1 | iex

Method 5: standalone binary

Download the matching archive from GitHub Releases, then unpack and run olb directly:

  • olb-linux-x86_64.tar.gz
  • olb-macos-x86_64.tar.gz
  • olb-macos-arm64.tar.gz
  • olb-windows-x86_64.zip

Quick Start

The most direct way to use it is:

olb start

Run it in the background:

olb start --background

If the machine has not been configured yet, olb start first runs initialization, then continues with enablement and startup. In interactive mode, it asks for:

  • Base URL
  • API Key
  • Reasoning effort

If you only want to update the configuration, run:

olb init

To stop the takeover:

olb disable

To stop a running bridge process:

olb stop

Common Commands

Command overview:

  • olb: runs initialization when no config exists; otherwise shows the current status
  • init: initial setup or reconfiguration
  • config: show the current configuration
  • config-path: show the configuration file path
  • status: show the current status
  • enable: install certificates, update hosts, and manage NSS on supported platforms
  • disable: remove the hosts takeover
  • start: if not initialized, run setup first, then execute enable and start the bridge immediately
  • start --background: start the bridge in the background and write logs to the config directory
  • stop: stop the current bridge process, including one started in the background
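The commands above compose into a typical session. The sketch below uses only the documented subcommands and skips itself when olb is not installed; note that olb init prompts interactively:

```shell
#!/bin/sh
# Typical bridge lifecycle using only documented subcommands.
# Guarded so the script is a no-op on machines without olb installed.
if ! command -v olb >/dev/null 2>&1; then
  echo "olb not installed; skipping"
else
  olb init               # initial setup or reconfiguration (interactive)
  olb start --background # enable the takeover, run the bridge, log to config dir
  olb status             # check hosts / root_ca / listener state
  olb stop               # stop the background bridge process
  olb disable            # remove the hosts takeover
fi
```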

Wrapper Script Entry Points

If you are running directly from the repository, you can also use:

Linux / macOS

./openai-local-bridge.sh <command>

Windows PowerShell

.\openai-local-bridge.ps1 <command>

Windows BAT

openai-local-bridge.bat <command>

All of these entry points forward to the same CLI.

If you install through npm, olb starts the bundled platform binary directly. Supported npm targets are Linux x64, macOS x64, macOS arm64, and Windows x64.

Release

The release workflow lives in .github/workflows/release-binaries.yml.

Before pushing a release tag:

  • set matching versions in pyproject.toml and package.json
  • configure GitHub secrets PYPI_API_TOKEN and NPM_TOKEN
  • push a tag such as v0.2.2

When the workflow runs on that tag, it:

  • builds standalone binaries for Linux x64, macOS x64, macOS arm64, and Windows x64
  • uploads those binaries to GitHub Releases
  • publishes the Python package to PyPI
  • publishes the root npm package; the npm installer downloads the matching binary from GitHub Releases

Using It in a Client

Using Trae as an example, the recommended flow is split into two phases.

Phase 1: Add the model in the client first

  1. Keep this project disabled:

    olb disable
    
  2. Confirm that the machine can reach the official OpenAI service.

  3. Add the model in the client, for example:

    • Provider: OpenAI
    • Model: Custom model
    • Model ID: gpt-5.4
    • API Key: your official OpenAI key

Phase 2: Enable the bridge for subsequent requests

olb start

Then choose the model you just added in the client.

FAQ

What does olb do by default?

  • If no configuration file exists, it starts initialization.
  • If configuration is already complete, it shows the current status.

What can I check with olb status?

These fields are usually the most important:

  • hosts: whether takeover is active
  • root_ca: whether the root certificate exists
  • nss: NSS status
  • listener: whether the local listener is running
  • listen_addr: listening address
  • config: configuration file location

Other software is affected too

That is expected with the current approach, because the takeover happens at the system level for api.openai.com.

Restore normal behavior immediately:

olb disable
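To see for yourself whether the system-level takeover is currently active, you can inspect the hosts file directly. A minimal check, assuming the Linux/macOS path /etc/hosts (on Windows the file is C:\Windows\System32\drivers\etc\hosts):

```shell
# Check whether the hosts file currently redirects api.openai.com.
if grep -n "api.openai.com" /etc/hosts 2>/dev/null; then
  echo "takeover entry present: other OpenAI clients on this machine are redirected"
else
  echo "no takeover entry: api.openai.com resolves normally"
fi
```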

Model requests fail

Check these items first:

  • Whether Base URL is correct
  • Whether API Key is correct
  • Whether the upstream service is OpenAI-compatible
  • Whether the upstream model you configured actually exists
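The first three items on that checklist can be smoke-tested with a direct request to the upstream's model list, bypassing the bridge. BASE_URL and API_KEY below are placeholders for your own values, and the /v1/models path assumes a standard OpenAI-compatible upstream:

```shell
# Smoke-test an OpenAI-compatible upstream by listing its models.
# BASE_URL and API_KEY are placeholders; substitute your own values.
BASE_URL="${BASE_URL:-https://api.example.com}"
API_KEY="${API_KEY:-sk-your-key}"
curl -sS --max-time 10 \
  -H "Authorization: Bearer $API_KEY" \
  "$BASE_URL/v1/models" || echo "request failed: check Base URL, key, and network"
```

If the response lists models but requests through the bridge still fail, check that the model ID you configured actually exists upstream.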

Failed to modify hosts or import certificates on Windows

This is usually a permission issue. Run the command again in a terminal with sufficient privileges.

Windows says missing command: openssl

The current implementation requires OpenSSL to be installed locally. Install OpenSSL first, then confirm that this works in your terminal:

openssl version
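Beyond checking the version, you can verify that OpenSSL can actually generate a local certificate, which is what olb enable / olb start need it for. This sketch writes a throwaway self-signed certificate to the temp directory and deletes it again:

```shell
# Verify OpenSSL can generate a local certificate (what olb enable needs).
if command -v openssl >/dev/null 2>&1; then
  openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
    -subj "/CN=localhost" \
    -keyout /tmp/olb-check.key -out /tmp/olb-check.crt 2>/dev/null \
    && echo "OpenSSL OK" || echo "OpenSSL failed to generate a certificate"
  rm -f /tmp/olb-check.key /tmp/olb-check.crt
else
  echo "openssl not found in PATH"
fi
```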

Startup fails on Linux / macOS

If you use the default port 443, the system may require elevated privileges. Follow the prompt, or switch to a higher port.

Configuration File

The CLI writes its configuration to a file under your user configuration directory.

View the path:

olb config-path

View the current configuration:

olb config

Security Notes

Before using this project, keep in mind:

  • It installs a local certificate on your machine.
  • It modifies the system hosts file.
  • Run olb disable when you are not using it.
