openai-local-bridge

Local OpenAI bridge CLI for OpenAI-compatible upstreams

Chinese documentation: 中文说明

openai-local-bridge routes local api.openai.com requests to a third-party OpenAI-compatible endpoint, allowing tools such as Trae and AI Assistant to use GPT Codex through a non-OpenAI API.

Prerequisites

  • Git
  • Python
  • uv: optional but recommended. When it is available, the npm launcher prefers uv so it can run the latest CLI directly.
  • OpenSSL: required when running olb enable or olb start to generate local certificates. On Windows, install OpenSSL first and make sure the directory containing openssl.exe is in PATH.

Check your environment:

git --version
python --version
openssl version

Installation

If you want a standalone binary with the Python runtime bundled, download the platform archive from GitHub Releases. Those archives do not require Python or uv; only OpenSSL is still needed for olb enable / olb start.

Method 1: uv

uv tool install openai-local-bridge

Method 2: pip

python -m pip install --user openai-local-bridge

Method 3: npm

npm install -g openai-local-bridge

The npm package downloads the matching standalone binary for the current platform, so runtime use does not require Python or uv.

Method 4: curl / PowerShell

Linux / macOS:

curl -fsSL https://raw.githubusercontent.com/duanluan/openai-local-bridge/main/install.sh | bash

Windows PowerShell:

irm https://raw.githubusercontent.com/duanluan/openai-local-bridge/main/install.ps1 | iex

Method 5: standalone binary

Download the matching archive from GitHub Releases, then unpack and run olb directly:

  • olb-linux-x86_64.tar.gz
  • olb-macos-x86_64.tar.gz
  • olb-macos-arm64.tar.gz
  • olb-windows-x86_64.zip

Quick Start

The most direct way to use it is:

olb start

Run it in the background:

olb start --background

If the machine has not been configured yet, olb start first runs initialization, then continues with enablement and startup. In interactive mode, it asks for:

  • Base URL
  • API Key
  • Reasoning effort

If you only want to update the configuration, run:

olb init
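Interactive setup collects three values: Base URL, API Key, and reasoning effort. Purely as an illustration (the actual file format and key names may differ; run olb config to see the real contents), the stored configuration might look like:

```json
{
  "base_url": "https://your-upstream.example.com/v1",
  "api_key": "sk-your-upstream-key",
  "reasoning_effort": "medium"
}
```

All three key names above are hypothetical placeholders; use olb config-path to locate the file that olb actually writes.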

To stop the takeover:

olb disable

To stop a running bridge process:

olb stop

Common Commands

Command overview:

  • olb: runs initialization when no config exists; otherwise shows the current status
  • init: initial setup or reconfiguration
  • config: show the current configuration
  • config-path: show the configuration file path
  • status: show the current status
  • enable: install certificates, update hosts, and manage NSS on supported platforms
  • disable: remove the hosts takeover
  • start: runs initialization first if needed, then enables the takeover and starts the bridge immediately
  • start --background: start the bridge in the background and write logs to the config directory
  • stop: stop the current bridge process, including one started in the background
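To get a rough picture of what enable does to name resolution, here is a minimal sketch of a hosts-style takeover entry. It uses a temporary file as a stand-in for the real hosts file, since olb manages the real entry itself:

```shell
# Sketch only: emulate the hosts entry that `olb enable` manages,
# against a temp file rather than the real /etc/hosts.
HOSTS=$(mktemp)
echo "127.0.0.1 api.openai.com" >> "$HOSTS"

# With an entry like this in the real hosts file, clients that call
# https://api.openai.com reach the local bridge instead of OpenAI.
grep "api.openai.com" "$HOSTS"
```

This is why olb enable also installs a local root certificate: the bridge must present a certificate for api.openai.com that the system trusts.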

Wrapper Script Entry Points

If you are running directly from the repository, you can also use:

Linux / macOS

./openai-local-bridge.sh <command>

Windows PowerShell

.\openai-local-bridge.ps1 <command>

Windows BAT

openai-local-bridge.bat <command>

All of these entry points forward to the same CLI.

If you install through npm, olb starts the bundled platform binary directly. Supported npm targets are Linux x64, macOS x64, macOS arm64, and Windows x64.

Release

The release workflow lives in .github/workflows/release-binaries.yml.

Before pushing a release tag:

  • set matching versions in pyproject.toml and package.json
  • configure GitHub secrets PYPI_API_TOKEN and NPM_TOKEN
  • push a tag such as v0.2.2

When the workflow runs on that tag, it:

  • builds standalone binaries for Linux x64, macOS x64, macOS arm64, and Windows x64
  • uploads those binaries to GitHub Releases
  • publishes the Python package to PyPI
  • publishes the root npm package and all platform npm binary packages
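The tag push that triggers the workflow can be sketched as follows (version number for illustration; it should match pyproject.toml and package.json):

```shell
# Bump the versions in pyproject.toml and package.json first, then:
git tag v0.2.2
git push origin v0.2.2   # pushing the tag triggers release-binaries.yml
```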

Using It in a Client

Using Trae as an example, the recommended flow is split into two phases.

Phase 1: Add the model in the client first

  1. Keep this project disabled:

    olb disable
    
  2. Confirm that the machine can reach the official OpenAI service.

  3. Add the model in the client, for example:

    • Provider: OpenAI
    • Model: Custom model
    • Model ID: gpt-5.4
    • API Key: your official OpenAI key

Phase 2: Enable the bridge for subsequent requests

olb start

Then choose the model you just added in the client.

FAQ

What does olb do by default?

  • If no configuration file exists, it starts initialization.
  • If configuration is already complete, it shows the current status.

What can I check with olb status?

These fields are usually the most important:

  • hosts: whether takeover is active
  • root_ca: whether the root certificate exists
  • nss: NSS status
  • listener: whether the local listener is running
  • listen_addr: listening address
  • config: configuration file location

Other software is affected too

That is expected with the current approach, because the takeover happens at the system level for api.openai.com.

Restore normal behavior immediately:

olb disable

Model requests fail

Check these items first:

  • Whether Base URL is correct
  • Whether API Key is correct
  • Whether the upstream service is OpenAI-compatible
  • Whether the upstream model you configured actually exists

Failed to modify hosts or import certificates on Windows

This is usually a permission issue. Run the command again from an elevated (Administrator) terminal.

Windows says missing command: openssl

The current implementation requires OpenSSL to be installed locally. Install OpenSSL first, then confirm that this works in your terminal:

openssl version

Startup fails on Linux / macOS

If you use the default port 443, the system may require elevated privileges. Follow the prompt, or switch to a higher port.
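The elevation requirement comes from the Unix privileged-port rule: ports below 1024 can only be bound by root (or, on Linux, a process granted CAP_NET_BIND_SERVICE). A quick way to check which side of the boundary a port falls on:

```shell
# Ports below 1024 are privileged on Unix; binding them needs elevated rights.
PORT=443
if [ "$PORT" -lt 1024 ]; then
  echo "port $PORT requires elevated privileges (e.g. run via sudo)"
else
  echo "port $PORT can be bound by an unprivileged process"
fi
```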

Configuration File

The CLI writes its configuration to a file under your user configuration directory.

View the path:

olb config-path

View the current configuration:

olb config

Security Notes

Before using this project, keep in mind:

  • It installs a local certificate on your machine.
  • It modifies the system hosts file.
  • Run olb disable when you are not using it.
