
Claude-Code-Inspector (CCI)

🔍 MITM Proxy for LLM API Traffic Analysis

A cross-platform command-line tool that intercepts, analyzes, and logs communications between AI coding assistants (Claude Code, Cursor, Codex, Gemini-CLI, etc.) and their backend LLM APIs.


✨ Features

  • Transparent Inspection - See exactly what prompts are sent and what responses are received
  • Streaming Support - Captures both streaming (SSE) and non-streaming API responses
  • Multi-Provider - Works with Anthropic, OpenAI, Google, Groq, Together, Mistral, and more
  • Automatic Masking - Protects API keys and sensitive data in logs
  • JSONL Output - Structured data format for easy analysis and processing
  • Stream Merger - Tool to consolidate streaming chunks into complete conversations
  • Cross-Platform - Works on Windows, macOS, and Linux

📦 Installation

Using pip

pip install claude-code-inspector

Using uv (recommended)

uv add claude-code-inspector

From source

git clone https://github.com/your-repo/claude-code-inspector.git
cd claude-code-inspector
uv sync

🚀 Quick Start

1. Start the Proxy

cci capture --port 8080 --output my_trace.jsonl

2. Configure Your Application

Set the proxy environment variables before running your AI tool:

export HTTP_PROXY=http://127.0.0.1:8080
export HTTPS_PROXY=http://127.0.0.1:8080

Note: some tools only read the lowercase variants (http_proxy, https_proxy), so set both forms if traffic is not captured.

3. Run Your AI Tool

# Example with Claude Code
claude -p "hello"

# Or Cursor, Codex, etc.

4. Stop Capture

Press Ctrl+C to stop the proxy.

5. (Optional) Merge Streaming Chunks

cci merge --input my_trace.jsonl --output merged.jsonl

📖 Certificate Installation

To intercept HTTPS traffic, you need to install the mitmproxy CA certificate.

macOS

  1. Run cci capture once to generate the certificate
  2. Open the certificate to add it to Keychain:
    open ~/.mitmproxy/mitmproxy-ca-cert.pem
    
  3. In Keychain Access, find the "mitmproxy" certificate
  4. Double-click it → expand "Trust" → set "When using this certificate" to "Always Trust"

Linux (Ubuntu/Debian)

# Generate certificate first
cci capture &
sleep 2
kill %1

# Install certificate
sudo cp ~/.mitmproxy/mitmproxy-ca-cert.pem /usr/local/share/ca-certificates/mitmproxy.crt
sudo update-ca-certificates

Linux (Fedora/RHEL)

sudo cp ~/.mitmproxy/mitmproxy-ca-cert.pem /etc/pki/ca-trust/source/anchors/
sudo update-ca-trust

Windows

  1. Run cci capture once to generate the certificate
  2. Navigate to %USERPROFILE%\.mitmproxy\
  3. Double-click mitmproxy-ca-cert.pem
  4. Click "Install Certificate"
  5. Select "Local Machine" → Next
  6. "Place all certificates in the following store"
  7. Browse → "Trusted Root Certification Authorities"
  8. Finish

Certificate Help Command

cci config --cert-help

📋 CLI Reference

cci capture

Start the proxy server and capture traffic.

cci capture [OPTIONS]

Options:
  -p, --port INTEGER     Proxy server port (default: 8080)
  -h, --host TEXT        Proxy server host (default: 127.0.0.1)
  -o, --output PATH      Output file path (default: cci_trace.jsonl)
  --debug                Enable debug mode with verbose logging
  -i, --include TEXT     Additional URL patterns to include (regex)
  -e, --exclude TEXT     URL patterns to exclude (regex)

Examples:

# Basic capture
cci capture

# Custom port and output
cci capture --port 9090 --output api_calls.jsonl

# Debug mode with custom filters
cci capture --debug --include ".*my-api\.com.*" --exclude ".*health.*"

cci merge

Merge streaming response chunks into complete records.

cci merge --input <file> --output <file>

Options:
  -i, --input PATH   Input JSONL file with raw streaming chunks [required]
  -o, --output PATH  Output JSONL file for merged records [required]

Example:

cci merge --input raw_trace.jsonl --output conversations.jsonl

cci config

Display configuration and setup help.

cci config [OPTIONS]

Options:
  --cert-help    Show certificate installation instructions
  --proxy-help   Show proxy configuration instructions
  --show         Show current configuration

cci stats

Display statistics for a captured trace file.

cci stats <file>

Example:

cci stats my_trace.jsonl
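
Because the trace is plain JSONL, similar statistics can also be computed directly in Python. A minimal sketch (field names follow the record examples in the Output Format section below; the actual output of cci stats may differ):

```python
import json

def trace_stats(lines):
    """Count records by type and average latency across an iterable of JSONL lines."""
    counts, latencies = {}, []
    for line in lines:
        rec = json.loads(line)
        counts[rec["type"]] = counts.get(rec["type"], 0) + 1
        # Non-streaming responses carry latency_ms; response_meta carries total_latency_ms
        lat = rec.get("latency_ms", rec.get("total_latency_ms"))
        if lat is not None:
            latencies.append(lat)
    avg = sum(latencies) / len(latencies) if latencies else 0.0
    return counts, avg

# Usage:
# with open("my_trace.jsonl") as f:
#     counts, avg_latency = trace_stats(f)
```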

📄 Output Format

CCI produces JSONL (JSON Lines) files with the following record types:

Request Record

{
  "type": "request",
  "id": "uuid-req-001",
  "timestamp": "2024-11-25T10:00:00Z",
  "method": "POST",
  "url": "https://api.anthropic.com/v1/messages",
  "headers": {"content-type": "application/json"},
  "body": {"model": "claude-3-sonnet", "messages": [...]}
}

Response Chunk (Streaming)

{
  "type": "response_chunk",
  "request_id": "uuid-req-001",
  "timestamp": "2024-11-25T10:00:01Z",
  "status_code": 200,
  "chunk_index": 0,
  "content": {"type": "content_block_delta", "delta": {"text": "Hello"}}
}

Response Meta

{
  "type": "response_meta",
  "request_id": "uuid-req-001",
  "total_latency_ms": 1500,
  "status_code": 200,
  "total_chunks": 42
}

Non-Streaming Response

{
  "type": "response",
  "request_id": "uuid-req-001",
  "timestamp": "2024-11-25T10:00:01Z",
  "status_code": 200,
  "headers": {...},
  "body": {...},
  "latency_ms": 1500
}

Merged Record (after cci merge)

{
  "request_id": "uuid-req-001",
  "timestamp": "2024-11-25T10:00:00Z",
  "method": "POST",
  "url": "https://api.anthropic.com/v1/messages",
  "request_body": {...},
  "response_status": 200,
  "response_text": "Hello! How can I help you today?",
  "total_latency_ms": 1500,
  "chunk_count": 42
}
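
Conceptually, merging groups response_chunk records by request_id and concatenates their text deltas. A sketch under the record shapes shown above (cci merge's real implementation handles more event types and fields):

```python
from collections import defaultdict

def merge_chunks(records):
    """Join streaming text deltas per request_id, mirroring the merge step."""
    texts = defaultdict(list)
    for rec in records:
        if rec.get("type") == "response_chunk":
            delta = rec.get("content", {}).get("delta", {})
            if "text" in delta:
                texts[rec["request_id"]].append(delta["text"])
    return {rid: "".join(parts) for rid, parts in texts.items()}
```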

⚙️ Configuration

CCI can be configured via TOML/YAML files or environment variables.

Configuration File

Create cci.toml in your current directory or ~/.config/cci/config.toml:

[proxy]
host = "127.0.0.1"
port = 8080
ssl_insecure = false

[filter]
include_patterns = [
    ".*api\\.anthropic\\.com.*",
    ".*api\\.openai\\.com.*",
    ".*generativelanguage\\.googleapis\\.com.*",
]
exclude_patterns = []

[masking]
mask_auth_headers = true
sensitive_headers = ["authorization", "x-api-key", "api-key"]
sensitive_body_fields = []
mask_pattern = "***MASKED***"

[storage]
output_file = "cci_trace.jsonl"
pretty_json = false
max_file_size_mb = 0  # 0 = no rotation

[logging]
level = "INFO"
log_file = ""  # optional file path
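
The [masking] settings above amount to replacing the values of sensitive headers before they reach the log. A sketch with a hypothetical helper (not CCI's actual implementation):

```python
def mask_headers(headers,
                 sensitive=("authorization", "x-api-key", "api-key"),
                 mask="***MASKED***"):
    """Replace values of sensitive headers (matched case-insensitively) for logging."""
    return {k: (mask if k.lower() in sensitive else v) for k, v in headers.items()}
```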

Environment Variables

CCI_PROXY_HOST=127.0.0.1
CCI_PROXY_PORT=8080
CCI_OUTPUT_FILE=my_trace.jsonl
CCI_LOG_LEVEL=DEBUG
CCI_INCLUDE_PATTERNS=.*my-api\.com.*,.*other-api\.com.*

🔧 Supported LLM Providers

CCI is pre-configured to capture traffic from:

| Provider | API Domain |
|----------|------------|
| Anthropic | api.anthropic.com |
| OpenAI | api.openai.com |
| Google | generativelanguage.googleapis.com |
| Together | api.together.xyz |
| Groq | api.groq.com |
| Mistral | api.mistral.ai |
| Cohere | api.cohere.ai |
| DeepSeek | api.deepseek.com |

Add custom providers with --include or in the config file.
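
Include/exclude filtering is plain regex matching against the request URL. A sketch of the assumed logic behind the --include and --exclude options (hypothetical helper; CCI's internals may differ):

```python
import re

def should_capture(url, include, exclude=()):
    """Capture a URL if it matches any include pattern and no exclude pattern."""
    if any(re.search(p, url) for p in exclude):
        return False
    return any(re.search(p, url) for p in include)
```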

🐛 Troubleshooting

SSL Handshake Error

Problem: SSL: CERTIFICATE_VERIFY_FAILED

Solution:

  1. Ensure the mitmproxy CA certificate is installed
  2. Run cci config --cert-help for instructions
  3. For testing, some tools support --insecure or verify=False

Proxy Connection Refused

Problem: Connection refused when connecting through proxy

Solution:

  1. Ensure CCI is running: cci capture
  2. Check the port is correct: --port 8080
  3. Check firewall settings

No Traffic Captured

Problem: CCI is running but no requests are logged

Solution:

  1. Verify proxy environment variables are set:
    echo $HTTP_PROXY $HTTPS_PROXY
    
  2. Check URL filter patterns match your API:
    cci config --show
    
  3. Add custom include pattern:
    cci capture --include ".*your-api\.com.*"
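
Step 1 above can be sanity-checked from Python as well. A small hypothetical helper (adjust expected_port to match your --port value):

```python
import os
from urllib.parse import urlparse

def proxy_configured(env=None, expected_port=8080):
    """Check that HTTP_PROXY/HTTPS_PROXY both point at the expected proxy port."""
    env = env if env is not None else os.environ
    for var in ("HTTP_PROXY", "HTTPS_PROXY"):
        value = env.get(var) or env.get(var.lower())
        if not value or urlparse(value).port != expected_port:
            return False
    return True
```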
    

High Memory Usage with Long Sessions

Problem: Memory grows during long capture sessions

Solution: Configure log rotation in cci.toml:

[storage]
max_file_size_mb = 100

📜 License

MIT License

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

📞 Support
