Local backend bridge for Claude Code and Codex.
Project description
Hit your limit? Need privacy? Just swap the model.
One alias. Claude Code or Codex on a local model. Skills, agents, MCP servers — all intact.
Quota hit mid-session?
`cc` keeps you going on a local model, no context lost. Code that can't leave your machine? Everything runs offline after model download. Don't want to rewire your workflow? Your `~/.claude`, skills, agents, and MCP servers carry over untouched.
Get Started → · Landing page →
Features
| Feature | What you get |
|---|---|
| Ollama first-class | ollama launch — no duplicated config, no custom Modelfiles |
| Config untouched | All skills, statusline, agents, plugins, and MCP servers carry over |
| Smart model selection | llmfit analyses your hardware and picks the best quantization that fits (optional — wizard prompts to install only when needed) |
| Resume on failure | Wizard persists progress — --resume picks up from the last completed step |
| Idempotent aliases | Re-running the wizard replaces the existing alias block, never appends |
| Cloud fallback | Run claude / codex directly (no prefix) to switch back instantly |
Quick Start
Install from PyPI (recommended)
pip install claude-codex-local
Or with uv:
uv tool install claude-codex-local
Then run the setup wizard:
ccl
One-command install (no clone required)
bash <(curl -sSL https://raw.githubusercontent.com/luongnv89/claude-codex-local/main/install.sh)
Or with wget:
bash <(wget -qO- https://raw.githubusercontent.com/luongnv89/claude-codex-local/main/install.sh)
Use `bash <(...)`, not `curl … | bash`. The wizard is interactive and needs a real TTY — piping steals stdin.
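The distinction can be demonstrated with bash's TTY test — `[ -t 0 ]` is true only when stdin is attached to a terminal:

```shell
# When a script arrives on stdin (curl … | bash), stdin IS the pipe,
# so interactive prompts can no longer read from the terminal.
if [ -t 0 ]; then
  echo "stdin is a TTY: interactive prompts will work"
else
  echo "stdin is a pipe/file: the wizard's prompts would fail"
fi
```

Process substitution (`bash <(curl …)`) passes the script as a file argument instead, leaving stdin attached to your terminal.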
Override defaults with env vars:
CCL_REF=v0.9.0 CCL_INSTALL_DIR=~/tools/claude-codex-local \
bash <(curl -sSL https://raw.githubusercontent.com/luongnv89/claude-codex-local/main/install.sh)
Install from a clone
git clone https://github.com/luongnv89/claude-codex-local.git
cd claude-codex-local
python3 -m venv .venv && source .venv/bin/activate
pip install -e .
ccl
After setup
Reload your shell so the alias is available:
source ~/.zshrc # or source ~/.bashrc
Then run:
cc # Claude Code → local model
cx # Codex CLI → local model
Wizard Steps
graph TD
A[1. Discover environment] --> B[2. Install missing components]
B --> C[3. Pick harness + engine]
C --> D[4. Pick model]
D --> E[5. Smoke test engine]
E --> F[6. Wire harness]
F --> G[7. Install helper + aliases]
G --> H[8. Verify launch end-to-end]
H --> I[9. Generate guide.md]
See guide.example.md for the personalized daily-use guide the wizard generates.
Usage
ccl # run the interactive first-run wizard
ccl setup --harness claude --engine ollama # skip the prefs picker
ccl setup --non-interactive # CI-friendly install
ccl setup --resume # resume after a failure
ccl find-model # standalone model recommendation
ccl doctor # wizard state + presence check
ccl --version # print version and exit
Advanced / debug (no user binary — run as a Python module):
python -m claude_codex_local.core profile # full hardware profile as JSON
python -m claude_codex_local.core recommend # llmfit-only model recommendation
python -m claude_codex_local.core adapters # list all engine adapters
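The profile output is plain JSON, so it composes with standard tools. A hedged sketch (the `engines` key is an assumption about the snapshot's shape, and the guard skips cleanly when the package isn't installed):

```shell
# Extract one field from the hardware-profile JSON, if the package is present.
if python3 -c 'import claude_codex_local' 2>/dev/null; then
  python3 -m claude_codex_local.core profile \
    | python3 -c 'import json,sys; print(json.load(sys.stdin).get("engines"))'
else
  echo "claude-codex-local is not installed in this environment"
fi
```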
Prerequisites
- macOS or Linux with zsh or bash
- Python 3.10+
- At least one harness: Claude Code or Codex CLI
- At least one engine: Ollama (recommended), LM Studio, vLLM, llama.cpp, or 9router (cloud-routing proxy)
- `llmfit` on `PATH` (optional — for automatic model selection)
Proven Paths
| Harness | Engine | Model | Status |
|---|---|---|---|
| Claude Code | Ollama | gemma4:26b | Verified end-to-end |
| Codex CLI | Ollama | gemma4:26b | Verified |
| Claude Code | LM Studio | Qwen3 family | Blocked — 400 `thinking.type`; wizard warns and recommends alternatives |
| Any | llama.cpp | any | Inline-env code path exists, no live proof yet |
| Any | vLLM | any | New in 0.8.0 — adapter shipped with tests |
| Claude Code | 9router | kr/claude-sonnet-4.5 | New in 0.9.0 — cloud-routed via `cc9` alias; existing `cc` is untouched |
| Codex CLI | 9router | kr/claude-sonnet-4.5 | New in 0.9.0 — cloud-routed via `cx9` alias; existing `cx` is untouched |
9router quick-start
9router is a local proxy that exposes an OpenAI-compatible API on http://localhost:20128/v1 and routes calls to cloud models such as kr/claude-sonnet-4.5. Picking 9router as the engine adds a new cc9 (Claude) or cx9 (Codex) alias and leaves your existing cc / cx aliases untouched.
Installing and running 9router
Step 1: Install 9router
# Using npm (recommended)
npm install -g 9router
# Or using yarn
yarn global add 9router
# Or using pnpm
pnpm add -g 9router
Step 2: Get your API key
- Visit the 9router dashboard and sign up or log in
- Navigate to API Keys section
- Create a new API key and copy it
Step 3: Start the 9router service
# Start 9router with your API key
9router start --api-key YOUR_API_KEY_HERE
# Or set it as an environment variable
export ROUTER9_API_KEY=YOUR_API_KEY_HERE
9router start
# The service will start on http://localhost:20128
Step 4: Verify 9router is running
# Check if the service is responding
curl http://localhost:20128/v1/models
# You should see a list of available models
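If the service takes a moment to come up, a small retry loop avoids a flaky first check. A sketch — the URL is the default port from above, and the function name is ours:

```shell
# Retry the models endpoint until it answers or attempts run out.
wait_for_9router() {
  local url=${1:-http://localhost:20128/v1/models} tries=${2:-5} i=1
  while [ "$i" -le "$tries" ]; do
    if curl -fsS "$url" >/dev/null 2>&1; then
      echo "up"; return 0
    fi
    i=$((i + 1)); sleep 1
  done
  echo "down"; return 1
}
```

Usage: `wait_for_9router` for the defaults, or `wait_for_9router http://localhost:20128/v1/models 10` for a longer wait.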
Step 5: Configure CCL to use 9router
# Interactive setup (wizard will prompt for API key)
ccl setup --engine 9router
# Non-interactive (CI / scripted):
CCL_9ROUTER_API_KEY=<paste-here> CCL_9ROUTER_MODEL=kr/claude-sonnet-4.5 \
ccl setup --engine 9router --harness claude --non-interactive
How the wizard configures 9router
The wizard:
- Asks for the 9router API key and writes it to `~/.claude-codex-local/9router-api-key` with `chmod 0600`. The helper script reads this file at exec time via `$(cat …)` — the key is never embedded in the script body or wizard state file.
- Verifies reachability via `GET /v1/models` only. It deliberately does not call `/chat/completions` during smoke-test or verify, because 9router routes to paid cloud models. The verification record is `{"ok": true, "via": "9router-models-endpoint", "skipped_chat": true}`.
- Installs `cc9` (or `cx9`) into your shell rc as a new fenced block (`# >>> claude-codex-local:claude9 >>>`), leaving any existing `cc` / `cx` block alone.
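The key-file handling described above can be sketched in a few lines (a sketch, not the wizard's actual code; `MY_9ROUTER_KEY` is a placeholder variable):

```shell
# Store the API key owner-readable only; consumers use $(cat …) at exec time.
keyfile="$HOME/.claude-codex-local/9router-api-key"
mkdir -p "$(dirname "$keyfile")"
umask 077                                  # new files created as 0600
printf '%s' "${MY_9ROUTER_KEY:-}" > "$keyfile"
chmod 0600 "$keyfile"                      # enforce even if umask was looser
```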
Tip: cc9 and cc can coexist on the same machine — pick cc9 when you want to burn cloud quota for a tough prompt, and cc (Ollama / LM Studio / llama.cpp) for everyday work.
Claude Code → 9router env vars
| Env var | 9router |
|---|---|
| `ANTHROPIC_BASE_URL` | `http://localhost:20128/v1` |
| `ANTHROPIC_AUTH_TOKEN` | `$(cat ~/.claude-codex-local/9router-api-key)` (read at exec) |
| `ANTHROPIC_API_KEY` | `$(cat ~/.claude-codex-local/9router-api-key)` (read at exec) |
| `ANTHROPIC_CUSTOM_MODEL_OPTION` | `<tag>` (e.g. `kr/claude-sonnet-4.5`) |
| `ANTHROPIC_CUSTOM_MODEL_OPTION_NAME` | `9router <tag>` |
| `CLAUDE_CODE_ATTRIBUTION_HEADER` | `"0"` |
| `CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC` | `"1"` |
For Codex: OPENAI_BASE_URL=http://localhost:20128/v1, OPENAI_API_KEY=$(cat …).
Rollback
# 1. Delete the fenced block(s) from ~/.zshrc / ~/.bashrc (between the marker lines)
# 2. Remove the local state directory
rm -rf .claude-codex-local
Each fence block (claude / codex / claude9 / codex9) is independent — you can remove just one without touching the others. Your ~/.claude and ~/.codex are unchanged.
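If you prefer not to edit the rc file by hand, a `sed` range delete over the markers does it. A sketch — the closing-marker text is assumed to mirror the opening marker, so verify the exact markers in your rc file first; `-i.bak` keeps a backup:

```shell
# Delete only the claude9 block; other fenced blocks stay untouched.
rc="$HOME/.zshrc"
if [ -f "$rc" ]; then
  sed -i.bak '/^# >>> claude-codex-local:claude9 >>>$/,/^# <<< claude-codex-local:claude9 <<<$/d' "$rc"
fi
```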
Architecture details
Three layers
1. Machine profile + model recommendation (`claude_codex_local/core.py`) — dumps a JSON snapshot of installed harnesses/engines/llmfit/disk, runs `llmfit` for ranked model recommendations, and provides a `doctor` command for pretty-printing wizard state.
2. Interactive wizard (`claude_codex_local/wizard.py`) — 9 steps from discovery to a ready-to-use daily alias. Persists progress in `.claude-codex-local/wizard-state.json` so `--resume` picks up after a failure.
3. Helper scripts + shell aliases — `.claude-codex-local/bin/cc` (or `cx`) is a short bash wrapper. For Ollama it runs `ollama launch claude|codex --model <tag>`. For LM Studio / llama.cpp it sets inline env vars and execs the real harness. A fenced block in `~/.zshrc` / `~/.bashrc` declares the aliases.
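For illustration, such a fenced alias block might look like the following. This is a hypothetical sketch — the marker text follows the wizard's `# >>> claude-codex-local:<name> >>>` convention, and the wizard writes the real block:

```shell
# >>> claude-codex-local:claude >>>
alias cc='$HOME/.claude-codex-local/bin/cc'
# <<< claude-codex-local:claude <<<
```

Single quotes keep `$HOME` unexpanded in the rc file; it is resolved when the alias runs.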
Why ollama launch
ollama launch claude --model <tag> is an official Ollama subcommand that sets the right env vars internally and execs the user's real claude binary against the local daemon — using ~/.claude as-is.
This means:
- No duplicated `~/.claude` directory
- No custom Modelfile or `ollama create`
- No `ANTHROPIC_CUSTOM_MODEL_OPTION` to manage manually
- `cc` just works
Claude Code → LM Studio / llama.cpp env vars
| Env var | LM Studio | llama.cpp |
|---|---|---|
| `ANTHROPIC_BASE_URL` | `http://localhost:1234` | `http://localhost:8001` |
| `ANTHROPIC_API_KEY` | `lmstudio` | `sk-local` |
| `ANTHROPIC_CUSTOM_MODEL_OPTION` | `<tag>` | `<tag>` |
| `ANTHROPIC_CUSTOM_MODEL_OPTION_NAME` | `Local (lmstudio) <tag>` | `Local (llamacpp) <tag>` |
| `CLAUDE_CODE_ATTRIBUTION_HEADER` | `"0"` | `"0"` |
| `CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC` | `"1"` | `"1"` |
Codex CLI → Ollama
ollama launch codex --model <tag> -- --oss --local-provider=ollama
The --oss --local-provider=ollama flags are required after -- because Codex otherwise tries to route through the ChatGPT account and rejects non-OpenAI model names.
Project structure
.
├── claude_codex_local/
│ ├── __init__.py # Package metadata + __version__
│ ├── wizard.py # Interactive setup wizard + `ccl` CLI
│ └── core.py # Machine profile, engine adapters, llmfit bindings
├── scripts/
│ └── e2e_smoke.sh # End-to-end smoke test
├── docs/
│ ├── poc-wizard.md # 9-step wizard architecture
│ ├── poc-architecture.md # System design overview
│ ├── poc-bootstrap.md # Bootstrap / install flow
│ └── poc-proof.md # Design rationale
├── tests/ # pytest test suite
├── install.sh # One-command remote installer
└── pyproject.toml # Project metadata and tool config
Tech stack
| Layer | Tool |
|---|---|
| Language | Python 3.10+ |
| UI / prompts | questionary, rich |
| Linting | ruff |
| Type checking | mypy |
| Testing | pytest + pytest-cov |
| Security | bandit, detect-secrets |
| Pre-commit | pre-commit |
Local state
Everything written by the bridge goes under .claude-codex-local/. Override with CLAUDE_CODEX_LOCAL_STATE_DIR.
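For example, to sandbox a test run without touching your main state (the directory name here is just an example):

```shell
# Redirect all bridge state to a throwaway directory for this shell session.
export CLAUDE_CODEX_LOCAL_STATE_DIR="$HOME/.cache/ccl-state-test"
mkdir -p "$CLAUDE_CODEX_LOCAL_STATE_DIR"
echo "state dir: $CLAUDE_CODEX_LOCAL_STATE_DIR"
```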
Contributing
Contributions are welcome. Read CONTRIBUTING.md before opening a PR.
For security issues, see SECURITY.md.
MIT — © 2026 Luong NGUYEN
Project details
File details
Details for the file claude_codex_local-0.9.0.tar.gz.
File metadata
- Download URL: claude_codex_local-0.9.0.tar.gz
- Upload date:
- Size: 101.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `2bc2b2d5b6bac26519454cdde84235ac8e9dd74e5b3c51e514424af9b320ffcf` |
| MD5 | `d09e682dfcf5dc8ff25f4590404395ea` |
| BLAKE2b-256 | `05e81464b2cb3f20f56c569987c2e7f8f9ae84143767f4a66a6a40258222559f` |
File details
Details for the file claude_codex_local-0.9.0-py3-none-any.whl.
File metadata
- Download URL: claude_codex_local-0.9.0-py3-none-any.whl
- Upload date:
- Size: 55.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `3ce40adfcb3c3aa9fcd6a691f135f1d975291aa365657f3d88812b43dfe0fdaa` |
| MD5 | `2b0deac7b6b2b8b165b6c95d2e85bc6d` |
| BLAKE2b-256 | `bbdadc72de6dd6e32e78da6244b784e4b9a6ba6db39d931d0219b0da3f40035d` |