Local backend bridge for Claude Code and Codex.
Hit your limit? Need privacy? Just swap the model.
One alias. Claude Code or Codex on a local model. Skills, agents, MCP servers — all intact.
Quota hit mid-session? `cc` keeps you going on a local model, no context lost. Code that can't leave your machine? Everything runs offline after model download. Don't want to rewire your workflow? Your `~/.claude`, skills, agents, and MCP servers carry over untouched.
## Features
| Feature | What you get |
|---|---|
| Ollama first-class | `ollama launch` — no duplicated config, no custom Modelfiles |
| Config untouched | All skills, statusline, agents, plugins, and MCP servers carry over |
| Smart model selection | llmfit analyses your hardware and picks the best quantization that fits (optional — wizard prompts to install only when needed) |
| Resume on failure | Wizard persists progress — --resume picks up from the last completed step |
| Idempotent aliases | Re-running the wizard replaces the existing alias block, never appends |
| Cloud fallback | Run claude / codex directly (no prefix) to switch back instantly |
## Quick Start

### Install from PyPI (recommended)

```bash
pip install claude-codex-local
```

Or with uv:

```bash
uv tool install claude-codex-local
```

Then run the setup wizard:

```bash
ccl
```
### One-command install (no clone required)

```bash
bash <(curl -sSL https://raw.githubusercontent.com/luongnv89/claude-codex-local/main/install.sh)
```

Or with wget:

```bash
bash <(wget -qO- https://raw.githubusercontent.com/luongnv89/claude-codex-local/main/install.sh)
```

Use `bash <(...)`, not `curl … | bash`. The wizard is interactive and needs a real TTY — piping steals stdin.
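The distinction matters because the right side of a pipeline gets the pipe as its stdin, so interactive prompts have no terminal to read from. You can see the difference yourself with the standard `-t` test (illustrative only, not part of the installer):

```bash
# [ -t 0 ] is true when stdin (fd 0) is a terminal.
if [ -t 0 ]; then
  echo "stdin is a TTY: interactive prompts will work"
else
  echo "stdin is not a TTY: interactive prompts would fail"
fi
```

Run it directly and stdin is your terminal; run it as `echo | bash script.sh` and the test fails, which is exactly what happens to the wizard under `curl … | bash`.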
Override defaults with env vars:

```bash
CCL_REF=v0.6.0 CCL_INSTALL_DIR=~/tools/claude-codex-local \
  bash <(curl -sSL https://raw.githubusercontent.com/luongnv89/claude-codex-local/main/install.sh)
```
### Install from a clone

```bash
git clone https://github.com/luongnv89/claude-codex-local.git
cd claude-codex-local
python3 -m venv .venv && source .venv/bin/activate
pip install -e .
ccl
```
### After setup

Reload your shell so the alias is available:

```bash
source ~/.zshrc   # or: source ~/.bashrc
```

Then run:

```bash
cc   # Claude Code → local model
cx   # Codex CLI → local model
```
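The wizard writes these aliases inside a marker-delimited block so re-runs can replace the block in place instead of appending. A hypothetical sketch of what that block in `~/.zshrc` might look like (the marker text and wrapper paths here are assumptions for illustration, not copied from the project):

```bash
# >>> claude-codex-local >>>   (hypothetical marker line)
alias cc="$HOME/.claude-codex-local/bin/cc"
alias cx="$HOME/.claude-codex-local/bin/cx"
# <<< claude-codex-local <<<
```

Keeping the aliases fenced between fixed markers is what makes the idempotent-alias behavior in the feature table possible: the wizard can find and replace exactly this region.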
## Wizard Steps

```mermaid
graph TD
    A[1. Discover environment] --> B[2. Install missing components]
    B --> C[3. Pick harness + engine]
    C --> D[4. Pick model]
    D --> E[5. Smoke test engine]
    E --> F[6. Wire harness]
    F --> G[7. Install helper + aliases]
    G --> H[8. Verify launch end-to-end]
    H --> I[9. Generate guide.md]
```

See guide.example.md for the personalized daily-use guide the wizard generates.
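Each step's completion is persisted so `--resume` can skip work already done. A minimal sketch of that pattern in shell (the plain-text state file here is an assumption made for brevity; the real wizard stores JSON in `.claude-codex-local/wizard-state.json`):

```bash
STATE_FILE="${STATE_FILE:-wizard-state.txt}"

# Has this step name been recorded as completed?
step_done() { grep -qx "$1" "$STATE_FILE" 2>/dev/null; }

# Record a step as completed.
mark_done() { echo "$1" >> "$STATE_FILE"; }

# Run a step only if it has not completed yet.
run_step() {
  local name="$1"; shift
  if step_done "$name"; then
    echo "skip: $name (already completed)"
  else
    "$@" && mark_done "$name"
  fi
}

run_step discover echo "discovering environment"
run_step discover echo "discovering environment"   # second call is skipped
```

Because failures leave earlier marks in place, rerunning with this scheme naturally resumes from the first incomplete step.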
## Usage

```bash
ccl                                          # run the interactive first-run wizard
ccl setup --harness claude --engine ollama   # skip the prefs picker
ccl setup --non-interactive                  # CI-friendly install
ccl setup --resume                           # resume after a failure
ccl find-model                               # standalone model recommendation
ccl doctor                                   # wizard state + presence check
ccl --version                                # print version and exit
```
Advanced / debug (no user binary — run as a Python module):

```bash
python -m claude_codex_local.core profile    # full hardware profile as JSON
python -m claude_codex_local.core recommend  # llmfit-only model recommendation
python -m claude_codex_local.core adapters   # list all engine adapters
```
## Prerequisites

- macOS or Linux with zsh or bash
- Python 3.10+
- At least one harness: Claude Code or Codex CLI
- At least one engine: Ollama (recommended), LM Studio, or llama.cpp
- `llmfit` on `PATH` (optional — for automatic model selection)
## Proven Paths

| Harness | Engine | Model | Status |
|---|---|---|---|
| Claude Code | Ollama | gemma4:26b | Verified end-to-end |
| Codex CLI | Ollama | gemma4:26b | Verified |
| Codex CLI | Ollama | qwen2.5-coder:0.5b | Verified |
| Claude Code | LM Studio | Qwen3 family | Blocked — 400 `thinking.type`; wizard warns and recommends alternatives |
| Any | llama.cpp | any | Inline-env code path exists, no live proof yet |
## Rollback

```bash
# Remove the fenced block from ~/.zshrc (between the marker lines)
rm -rf .claude-codex-local
```

That's it. Your `~/.claude` and `~/.codex` are unchanged.
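If you want to strip the alias block from your rc file non-interactively, a `sed` address-range delete works. The marker text below is a hypothetical placeholder, not the project's actual markers; check your `~/.zshrc` for the real ones before running anything like this:

```bash
# Delete every line from the opening marker through the closing marker,
# inclusive. "-i.bak" keeps a backup copy, and works on both GNU and BSD sed.
sed -i.bak '/>>> claude-codex-local >>>/,/<<< claude-codex-local <<</d' ~/.zshrc
```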
## Architecture details

### Three layers

1. **Machine profile + model recommendation** (`claude_codex_local/core.py`) — dumps a JSON snapshot of installed harnesses/engines/llmfit/disk, runs `llmfit` for ranked model recommendations, and provides a `doctor` command for pretty-printing wizard state.
2. **Interactive wizard** (`claude_codex_local/wizard.py`) — 9 steps from discovery to a ready-to-use daily alias. Persists progress in `.claude-codex-local/wizard-state.json` so `--resume` picks up after a failure.
3. **Helper scripts + shell aliases** — `.claude-codex-local/bin/cc` (or `cx`) is a short bash wrapper. For Ollama it runs `ollama launch claude|codex --model <tag>`. For LM Studio / llama.cpp it sets inline env vars and execs the real harness. A fenced block in `~/.zshrc` / `~/.bashrc` declares the aliases.
## Why `ollama launch`

`ollama launch claude --model <tag>` is an official Ollama subcommand that sets the right env vars internally and execs the user's real `claude` binary against the local daemon — using `~/.claude` as-is.

This means:

- No duplicated `~/.claude` directory
- No custom Modelfile or `ollama create`
- No `ANTHROPIC_CUSTOM_MODEL_OPTION` to manage manually
- `cc` just works
## Claude Code → LM Studio / llama.cpp env vars

| Env var | LM Studio | llama.cpp |
|---|---|---|
| `ANTHROPIC_BASE_URL` | `http://localhost:1234` | `http://localhost:8001` |
| `ANTHROPIC_API_KEY` | `lmstudio` | `sk-local` |
| `ANTHROPIC_CUSTOM_MODEL_OPTION` | `<tag>` | `<tag>` |
| `ANTHROPIC_CUSTOM_MODEL_OPTION_NAME` | `Local (lmstudio) <tag>` | `Local (llamacpp) <tag>` |
| `CLAUDE_CODE_ATTRIBUTION_HEADER` | `"0"` | `"0"` |
| `CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC` | `"1"` | `"1"` |
## Codex CLI → Ollama

```bash
ollama launch codex --model <tag> -- --oss --local-provider=ollama
```

The `--oss --local-provider=ollama` flags are required after `--` because Codex otherwise tries to route through the ChatGPT account and rejects non-OpenAI model names.
## Qwen3 + Claude Code

Claude Code sends a thinking payload that Qwen3 reasoning models interpret as an unterminated `<think>` block. The wizard detects Qwen3 model names at pick time and recommends Gemma 3 or Qwen 2.5 Coder instead.
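A hedged sketch of what such a pick-time guard might look like. The function name and glob patterns are assumptions for illustration, not the wizard's actual code:

```bash
# Hypothetical check: flag Qwen3-family tags before wiring them to Claude Code.
is_qwen3() {
  case "$1" in
    qwen3*|*Qwen3*|*qwen-3*) return 0 ;;
    *) return 1 ;;
  esac
}

if is_qwen3 "qwen3:8b"; then
  echo "warning: Qwen3 models are blocked with Claude Code; pick Gemma or Qwen 2.5 Coder"
fi
```

Note that `qwen2.5-coder` tags do not match any of the patterns, so the verified Codex path above is unaffected.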
## Project structure

```
.
├── claude_codex_local/
│   ├── __init__.py          # Package metadata + __version__
│   ├── wizard.py            # Interactive setup wizard + `ccl` CLI
│   └── core.py              # Machine profile, engine adapters, llmfit bindings
├── scripts/
│   └── e2e_smoke.sh         # End-to-end smoke test
├── docs/
│   ├── poc-wizard.md        # 9-step wizard architecture
│   ├── poc-architecture.md  # System design overview
│   ├── poc-bootstrap.md     # Bootstrap / install flow
│   └── poc-proof.md         # Design rationale
├── tests/                   # pytest test suite
├── install.sh               # One-command remote installer
└── pyproject.toml           # Project metadata and tool config
```
## Tech stack
| Layer | Tool |
|---|---|
| Language | Python 3.10+ |
| UI / prompts | questionary, rich |
| Linting | ruff |
| Type checking | mypy |
| Testing | pytest + pytest-cov |
| Security | bandit, detect-secrets |
| Pre-commit | pre-commit |
## Local state

Everything written by the bridge goes under `.claude-codex-local/`. Override with `CLAUDE_CODEX_LOCAL_STATE_DIR`.
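The override follows the standard environment-variable-with-default pattern. A sketch (the default directory name comes from this README; the `${var:-default}` expansion is plain POSIX shell):

```bash
# Use the override if set and non-empty, otherwise the default state dir.
STATE_DIR="${CLAUDE_CODEX_LOCAL_STATE_DIR:-.claude-codex-local}"
echo "state goes under: $STATE_DIR"
```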
## Contributing

Contributions are welcome. Read CONTRIBUTING.md before opening a PR. For security issues, see SECURITY.md.

## License

MIT — © 2024 Luong NGUYEN