API server that provides an Anthropic- and OpenAI-compatible interface over Claude Code, allowing you to use either your Claude OAuth account or the API directly.

Project description

CCProxy API Server

ccproxy is a local reverse proxy server for the Anthropic Claude API (api.anthropic.com/v1/messages). It allows you to use your existing Claude Max subscription to interact with the Anthropic API, bypassing the need for separate API-key billing.

The server provides two primary modes of operation:

  • SDK Mode (/sdk): Routes requests through the local claude-code-sdk. This enables access to tools configured in your Claude environment.
  • API Mode (/api): Acts as a direct reverse proxy, injecting the necessary authentication headers. This provides full access to the underlying API features and model settings.

It includes a translation layer to support both Anthropic and OpenAI-compatible API formats for requests and responses, including streaming.
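To give a feel for what this translation involves, here is a deliberately simplified sketch of converting an OpenAI-style chat payload into the Anthropic Messages format. The function name and the handling shown are illustrative assumptions, not ccproxy's actual code; the real layer also covers content blocks, tool calls, and streaming deltas.

```python
def openai_to_anthropic(payload: dict) -> dict:
    """Convert a minimal OpenAI chat payload to Anthropic Messages format.

    Simplified illustration: OpenAI carries the system prompt as a
    message role, while Anthropic uses a top-level "system" field.
    """
    system_parts = [m["content"] for m in payload["messages"] if m["role"] == "system"]
    body = {
        "model": payload["model"],
        "max_tokens": payload.get("max_tokens", 1024),
        "messages": [m for m in payload["messages"] if m["role"] != "system"],
    }
    if system_parts:
        body["system"] = "\n".join(system_parts)
    return body


openai_req = {
    "model": "claude-3-5-sonnet-20241022",
    "messages": [
        {"role": "system", "content": "Be concise."},
        {"role": "user", "content": "Hello!"},
    ],
    "max_tokens": 100,
}
anthropic_req = openai_to_anthropic(openai_req)
```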

Installation

# The official claude-code CLI is required for SDK mode
npm install -g @anthropic-ai/claude-code

# Run it with uv
uvx ccproxy-api

# Run it with pipx
pipx run ccproxy-api

# Install with uv
uv tool install ccproxy-api

# Install with pipx
pipx install ccproxy-api

# Optional: Enable shell completion
eval "$(ccproxy --show-completion zsh)"  # For zsh
eval "$(ccproxy --show-completion bash)" # For bash

For the development version, replace ccproxy-api with git+https://github.com/caddyglow/ccproxy-api.git@dev.

Authentication

The proxy uses two different authentication mechanisms depending on the mode.

  1. Claude CLI (SDK mode): This mode relies on the authentication handled by the claude-code-sdk.

    claude /login
    

    It is now also possible to obtain a long-lived token to avoid renewal issues:

    claude setup-token

  2. ccproxy (API mode): This mode uses its own OAuth2 flow to obtain credentials for direct API access.

    ccproxy auth login
    

    If you are already logged in with the Claude CLI, the credentials should be found automatically.

You can check the status of these credentials with ccproxy auth validate and ccproxy auth info.

A warning is shown at startup if no credentials are configured.

Usage

Running the Server

# Start the proxy server
ccproxy

The server will start on http://127.0.0.1:8000 by default.

Client Configuration

Point your existing tools and applications to the local proxy instance by setting the appropriate environment variables. A dummy API key is required by most client libraries but is not used by the proxy itself.

For OpenAI-compatible clients:

# For SDK mode
export OPENAI_BASE_URL="http://localhost:8000/sdk/v1"
# For API mode
export OPENAI_BASE_URL="http://localhost:8000/api/v1"

export OPENAI_API_KEY="dummy-key"

For Anthropic-compatible clients:

# For SDK mode
export ANTHROPIC_BASE_URL="http://localhost:8000/sdk"
# For API mode
export ANTHROPIC_BASE_URL="http://localhost:8000/api"

export ANTHROPIC_API_KEY="dummy-key"
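The same configuration can be exercised from code. The sketch below uses only the Python standard library to assemble an Anthropic-format request against the proxy (API mode); the request is built but not sent, since it assumes a proxy running at the default address. The dummy key is a placeholder the proxy itself ignores.

```python
import json
import os
import urllib.request

# Point the client at the local proxy; defaults are used only if the
# variables are not already set in the environment.
os.environ.setdefault("ANTHROPIC_BASE_URL", "http://localhost:8000/api")
os.environ.setdefault("ANTHROPIC_API_KEY", "dummy-key")

base = os.environ["ANTHROPIC_BASE_URL"]
req = urllib.request.Request(
    f"{base}/v1/messages",
    data=json.dumps({
        "model": "claude-3-5-sonnet-20241022",
        "messages": [{"role": "user", "content": "Hello!"}],
        "max_tokens": 100,
    }).encode(),
    headers={
        "Content-Type": "application/json",
        "x-api-key": os.environ["ANTHROPIC_API_KEY"],
    },
    method="POST",
)
# urllib.request.urlopen(req) would send it once the proxy is running.
```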

Using with Aider

CCProxy works seamlessly with Aider and other AI coding assistants:

Anthropic Mode

export ANTHROPIC_API_KEY=dummy
export ANTHROPIC_BASE_URL=http://127.0.0.1:8000/api
aider --model claude-sonnet-4-20250514

OpenAI Mode with Model Mapping

If your tool only supports OpenAI settings, ccproxy automatically maps OpenAI models to Claude:

export OPENAI_API_KEY=dummy
export OPENAI_BASE_URL=http://127.0.0.1:8000/api/v1
aider --model o3-mini
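Conceptually, this aliasing is just a lookup table with pass-through for names that are already Claude models. The mapping below is a hypothetical illustration only; the actual table lives inside ccproxy and may map different aliases to different models.

```python
# Hypothetical OpenAI-to-Claude aliases for illustration; ccproxy's
# real mapping table may differ.
MODEL_MAP = {
    "o3-mini": "claude-sonnet-4-20250514",
    "gpt-4o": "claude-opus-4-20250514",
}


def resolve_model(requested: str) -> str:
    """Return the Claude model for a known OpenAI alias, else pass through."""
    return MODEL_MAP.get(requested, requested)
```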

API Mode (Direct Proxy)

For minimal interference and direct API access:

export OPENAI_API_KEY=dummy
export OPENAI_BASE_URL=http://127.0.0.1:8000/api/v1
aider --model o3-mini

curl Example

# SDK mode
curl -X POST http://localhost:8000/sdk/v1/messages \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-3-5-sonnet-20241022",
    "messages": [{"role": "user", "content": "Hello!"}],
    "max_tokens": 100
  }'

# API mode
curl -X POST http://localhost:8000/api/v1/messages \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-3-5-sonnet-20241022",
    "messages": [{"role": "user", "content": "Hello!"}],
    "max_tokens": 100
  }'

More examples are available in the examples/ directory.

Endpoints

The proxy exposes endpoints under two prefixes, corresponding to its operating modes.

Mode   URL Prefix   Description                                       Use Case
SDK    /sdk/        Uses claude-code-sdk with its configured tools.   Accessing Claude with local tools.
API    /api/        Direct proxy with header injection.               Full API control, direct access.
  • Anthropic:
    • POST /sdk/v1/messages
    • POST /api/v1/messages
  • OpenAI-Compatible:
    • POST /sdk/v1/chat/completions
    • POST /api/v1/chat/completions
  • Utility:
    • GET /health
    • GET /sdk/models, GET /api/models
    • GET /sdk/status, GET /api/status
    • GET /oauth/callback
  • Observability (Optional):
    • GET /metrics
    • GET /logs/status, GET /logs/query
    • GET /dashboard

Supported Models

CCProxy supports recent Claude models including Opus, Sonnet, and Haiku variants. The specific models available to you will depend on your Claude account and the features enabled for your subscription.

  • claude-opus-4-20250514
  • claude-sonnet-4-20250514
  • claude-3-7-sonnet-20250219
  • claude-3-5-sonnet-20241022
  • claude-3-5-sonnet-20240620

Configuration

Settings can be configured through (in order of precedence):

  1. Command-line arguments
  2. Environment variables
  3. .env file
  4. TOML configuration files (.ccproxy.toml, ccproxy.toml, or ~/.config/ccproxy/config.toml)
  5. Default values

For complex configurations, you can use a nested syntax for environment variables with __ as a delimiter:

# Server settings
SERVER__HOST=0.0.0.0
SERVER__PORT=8080
# etc.
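The `__` delimiter splits each variable name into a path through the nested settings. ccproxy delegates this to its settings library, but the behavior can be sketched with a small stand-in parser (the function below is illustrative, not ccproxy's code):

```python
def parse_nested_env(env: dict) -> dict:
    """Fold "__"-delimited variable names into a nested settings dict.

    Simplified stand-in for the nested env-var handling described above.
    """
    settings: dict = {}
    for key, value in env.items():
        node = settings
        *parents, leaf = key.lower().split("__")
        for part in parents:
            node = node.setdefault(part, {})
        node[leaf] = value
    return settings


config = parse_nested_env({"SERVER__HOST": "0.0.0.0", "SERVER__PORT": "8080"})
```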

Securing the Proxy (Optional)

You can enable token authentication for the proxy. This supports multiple header formats (x-api-key for Anthropic, Authorization: Bearer for OpenAI) for compatibility with standard client libraries.

1. Generate a Token:

ccproxy generate-token
# Output: SECURITY__AUTH_TOKEN=abc123xyz789...

2. Configure the Token:

# Set environment variable
export SECURITY__AUTH_TOKEN=abc123xyz789...

# Or add to .env file
echo "SECURITY__AUTH_TOKEN=abc123xyz789..." >> .env

3. Use in Requests: When authentication is enabled, include the token in your API requests.

# Anthropic Format (x-api-key)
curl -H "x-api-key: your-token" ...

# OpenAI/Bearer Format
curl -H "Authorization: Bearer your-token" ...

Observability

ccproxy includes an optional but powerful observability suite for monitoring and analytics. When enabled, it provides:

  • Prometheus Metrics: A /metrics endpoint for real-time operational monitoring.
  • Access Log Storage: Detailed request logs, including token usage and costs, are stored in a local DuckDB database.
  • Analytics API: Endpoints to query and analyze historical usage data.
  • Real-time Dashboard: A live web interface at /dashboard to visualize metrics and request streams.

These features are disabled by default and can be enabled via configuration. For a complete guide on setting up and using these features, see the Observability Documentation.

Troubleshooting

Common Issues

  1. Authentication Error: Ensure you're using the correct mode (/sdk or /api) for your authentication method.
  2. Claude Credentials Expired: Run ccproxy auth login to refresh credentials for API mode. Run claude /login for SDK mode.
  3. Missing API Auth Token: If you've enabled security, include the token in your request headers.
  4. Port Already in Use: Start the server on a different port: ccproxy --port 8001.
  5. Model Not Available: Check that your Claude subscription includes the requested model.

Contributing

Please see CONTRIBUTING.md for details.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • Anthropic for Claude and the Claude Code SDK
  • The open-source community

Project details


Download files

Download the file for your platform.

Source Distribution

ccproxy_api-0.1.1.tar.gz (605.2 kB)

Built Distribution

ccproxy_api-0.1.1-py3-none-any.whl (263.3 kB)

File details

Details for the file ccproxy_api-0.1.1.tar.gz.

File metadata

  • Download URL: ccproxy_api-0.1.1.tar.gz
  • Upload date:
  • Size: 605.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for ccproxy_api-0.1.1.tar.gz
Algorithm Hash digest
SHA256 a66c44eb4451203e6d8f18996e1daa82f75314019d0ab0d56d2136e72035a3e2
MD5 51665e3c2760f45ce958e4369141ca92
BLAKE2b-256 a6731e9913a92e498e4b27ae38e927e4ced6c197c0ffd20c8cceda95f54a7095

Provenance

The following attestation bundles were made for ccproxy_api-0.1.1.tar.gz:

Publisher: release.yml on CaddyGlow/ccproxy-api

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file ccproxy_api-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: ccproxy_api-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 263.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for ccproxy_api-0.1.1-py3-none-any.whl
Algorithm Hash digest
SHA256 f44e723bb3abf281b96f437dbd99d2c64c138edb1c46f87519e74e8f438ebc52
MD5 280cae2ac948319cce8422e84300423e
BLAKE2b-256 1fea3ab3d4f39f6e137c091b812b24998958ea8e7e865e7921700126808bd09e

Provenance

The following attestation bundles were made for ccproxy_api-0.1.1-py3-none-any.whl:

Publisher: release.yml on CaddyGlow/ccproxy-api

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
