# cmappy

OpenAI-compatible API proxy for Claude Max subscribers. Wraps the Claude Code CLI as a subprocess and exposes a standard `/v1/chat/completions` endpoint — zero API keys, zero extra cost.
```
Your App (any OpenAI-compatible client)
        ↓
HTTP Request (OpenAI format)
        ↓
cmappy (FastAPI, port 3456)
        ↓
Claude Code CLI (subprocess)
        ↓
OAuth Token (from Max subscription)
        ↓
Anthropic API
        ↓
Response → OpenAI format → Your App
```
## Prerequisites

- Claude Max subscription ($100 or $200/month)
- Claude Code CLI installed and authenticated:

  ```sh
  npm install -g @anthropic-ai/claude-code
  claude auth login
  ```

- Python 3.10+ and uv
## Quick start

```sh
# install (pick one)
uv tool install claude-max-proxy-py   # uv
pip install claude-max-proxy-py       # pip

# run
cmappy
```

The server starts on http://127.0.0.1:3456. Point any OpenAI-compatible client at it.
## Install

```sh
# from PyPI
uv tool install claude-max-proxy-py   # uv (recommended)
pip install claude-max-proxy-py       # pip

# from source
git clone https://github.com/p-sumann/claude-max-proxy-py.git
cd claude-max-proxy-py
uv tool install .   # or: pip install .
```
## Uninstall

```sh
uv tool uninstall claude-max-proxy-py   # uv
pip uninstall claude-max-proxy-py       # pip
```
## Usage

```sh
# default
cmappy

# custom port
cmappy --port 8080

# bind to all interfaces
cmappy --host 0.0.0.0

# skip auth check on startup
cmappy --skip-auth-check

# dev mode (auto-reload)
cmappy --reload
```
## API

| Method | Endpoint | Description |
|---|---|---|
| GET | `/health` | Health check |
| GET | `/v1/models` | List available models |
| POST | `/v1/chat/completions` | Chat completions (streaming + non-streaming) |
## Models

| Model ID | CLI Alias |
|---|---|
| `claude-opus-4` | `opus` |
| `claude-sonnet-4` | `sonnet` |
| `claude-sonnet-5` | `sonnet` |
| `claude-haiku-4` | `haiku` |
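The ID-to-alias mapping above amounts to a plain lookup. A minimal sketch of how a caller might resolve it client-side (`cli_alias` is a hypothetical helper, not part of the proxy's API):

```python
# Mapping mirroring the table above; the proxy's internal structure may differ.
MODEL_ALIASES = {
    "claude-opus-4": "opus",
    "claude-sonnet-4": "sonnet",
    "claude-sonnet-5": "sonnet",
    "claude-haiku-4": "haiku",
}

def cli_alias(model_id: str) -> str:
    """Resolve an OpenAI-style model ID to its Claude CLI alias."""
    try:
        return MODEL_ALIASES[model_id]
    except KeyError:
        raise ValueError(f"unknown model: {model_id}") from None

print(cli_alias("claude-sonnet-4"))  # sonnet
```

Note that two model IDs map to the same `sonnet` alias, so the reverse mapping is not unique.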
### curl

```sh
curl -X POST http://localhost:3456/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "claude-sonnet-4", "messages": [{"role": "user", "content": "Hello!"}]}'
```
### Streaming

```sh
curl -N -X POST http://localhost:3456/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "claude-sonnet-4", "messages": [{"role": "user", "content": "Hello!"}], "stream": true}'
```
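On the client side, the stream can also be consumed without an SDK. A minimal sketch that extracts text deltas from SSE `data:` lines, assuming the standard OpenAI streaming chunk shape (`choices[0].delta.content`, `[DONE]` terminator):

```python
import json

def extract_deltas(sse_lines):
    """Yield text chunks from OpenAI-style SSE 'data:' lines."""
    for line in sse_lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip comments and blank keep-alive lines
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":  # end-of-stream sentinel
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"].get("content")
        if delta:
            yield delta

# Two chunks in the standard format, as the curl above would receive them:
sample = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo!"}}]}',
    "data: [DONE]",
]
print("".join(extract_deltas(sample)))  # Hello!
```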
### OpenAI Python SDK

```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:3456/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="claude-sonnet-4",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```
## How it works

The proxy spawns the `claude` CLI with these flags:

```sh
claude --print --output-format stream-json --verbose \
  --include-partial-messages --model <alias> \
  --no-session-persistence "<prompt>"
```

It reads stdout as NDJSON and classifies each line into events: `content_block_delta` (text chunks for SSE), `assistant` (model name), and `result` (final response with usage stats). These are converted on the fly into OpenAI-format responses.
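The classification step might look roughly like this. A sketch only, not the proxy's actual source; the payload field names (`delta.text`, `message.model`, `usage`) are assumptions about the `stream-json` output, keyed on the three event types named above:

```python
import json

def classify(ndjson_line: str) -> tuple[str, dict]:
    """Classify one NDJSON line from the CLI's stream-json output.

    Returns (event_type, payload). Event names follow the three cases
    described above; unrecognized lines pass through unchanged.
    """
    event = json.loads(ndjson_line)
    etype = event.get("type", "unknown")
    if etype == "content_block_delta":
        # text chunk destined for an SSE data: frame
        return etype, {"text": event.get("delta", {}).get("text", "")}
    if etype == "assistant":
        # carries the model name to echo back in the response
        return etype, {"model": event.get("message", {}).get("model")}
    if etype == "result":
        # final event with token usage stats
        return etype, {"usage": event.get("usage", {})}
    return etype, event

etype, payload = classify('{"type":"content_block_delta","delta":{"text":"Hi"}}')
print(etype, payload["text"])  # content_block_delta Hi
```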
Development
git clone https://github.com/p-sumann/claude-max-proxy-py.git
cd claude-max-proxy
# install with dev deps
uv sync --dev
# run without installing
uv run cmappy
# run tests
uv run pytest
# lint
uv run ruff check .
## Contributing

Contributions are welcome! Please read the contributing guide before opening a PR.
## Security

If you discover a security vulnerability, please see our security policy. Do not open a public issue for security reports.
## License
## File details

Details for the file `claude_max_proxy_py-0.1.2.tar.gz`.

### File metadata

- Download URL: claude_max_proxy_py-0.1.2.tar.gz
- Upload date:
- Size: 69.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.9.17 {"installer":{"name":"uv","version":"0.9.17","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"macOS","version":null,"id":null,"libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null}
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `7f70d82cded7ed01800650768a1149f699a9bbb012a000ca88e293489c6f1f8c` |
| MD5 | `19831d9b48b86d1949c8d0e3251281c8` |
| BLAKE2b-256 | `536e1f9f38746cbd3fb4ff90dc18ba88ab0c22c24649166337cdd68a72c15f93` |
## File details

Details for the file `claude_max_proxy_py-0.1.2-py3-none-any.whl`.

### File metadata

- Download URL: claude_max_proxy_py-0.1.2-py3-none-any.whl
- Upload date:
- Size: 17.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.9.17 {"installer":{"name":"uv","version":"0.9.17","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"macOS","version":null,"id":null,"libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null}
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `2d3496fae299648acc911fa0fca72dab7c40afcade4ce118395f0e84cb6e2fad` |
| MD5 | `4c23759e0666288ac210c1b0128ff924` |
| BLAKE2b-256 | `343f150b87e3033d2d5310fe0e5333f5c5f96d668f1152096192f6fc07569739` |