Claude-Code-Inspector (CCI)
🔍 MITM Proxy for LLM API Traffic Analysis
A cross-platform command-line tool that intercepts, analyzes, and logs communications between AI coding assistants (Claude Code, Cursor, Codex, Gemini-CLI, etc.) and their backend LLM APIs.
✨ Features
- Transparent Inspection - See exactly what prompts are sent and what responses are received
- Streaming Support - Captures both streaming (SSE) and non-streaming API responses
- Multi-Provider - Works with Anthropic, OpenAI, Google, Groq, Together, Mistral, and more
- Automatic Masking - Protects API keys and sensitive data in logs
- JSONL Output - Structured data format for easy analysis and processing
- Stream Merger - Tool to consolidate streaming chunks into complete conversations
- Cross-Platform - Works on Windows, macOS, and Linux
📦 Installation
Using pip
pip install claude-code-inspector
Using uv (recommended)
uv add claude-code-inspector
From source
git clone https://github.com/chouzz/claude-code-inspector.git
cd claude-code-inspector
uv sync
🚀 Quick Start
1. Start the Proxy
cci capture --port 8080 --output my_trace.jsonl
2. Configure Your Application
Set the proxy environment variables before running your AI tool:
export HTTP_PROXY=http://127.0.0.1:8080
export HTTPS_PROXY=http://127.0.0.1:8080
Important for Node.js Applications (Claude Code, Cursor, etc.):
Node.js applications require the NODE_EXTRA_CA_CERTS environment variable to trust the mitmproxy CA certificate:
export NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem
3. Run Your AI Tool
# Example with Claude Code (full configuration)
export HTTP_PROXY=http://127.0.0.1:8080
export HTTPS_PROXY=http://127.0.0.1:8080
export NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem
claude -p "hello"
# Or Cursor, Codex, etc.
4. Stop Capture
Press Ctrl+C to stop the proxy.
5. (Optional) Merge and Split
# Merge streaming chunks into complete records
cci merge --input my_trace.jsonl --output merged.jsonl
# Split merged records into individual text files for analysis
cci split --input merged.jsonl --output-dir ./split_output
📖 Certificate Installation
To intercept HTTPS traffic, you need to install the mitmproxy CA certificate.
macOS
- Run `cci capture` once to generate the certificate
- Open the certificate: `open ~/.mitmproxy/mitmproxy-ca-cert.pem`
- Double-click to add it to the Keychain
- In Keychain Access, find "mitmproxy"
- Double-click → Trust → "Always Trust"
Linux (Ubuntu/Debian)
# Generate certificate first
cci capture &
sleep 2
kill %1
# Install certificate
sudo cp ~/.mitmproxy/mitmproxy-ca-cert.pem /usr/local/share/ca-certificates/mitmproxy.crt
sudo update-ca-certificates
Linux (Fedora/RHEL)
sudo cp ~/.mitmproxy/mitmproxy-ca-cert.pem /etc/pki/ca-trust/source/anchors/
sudo update-ca-trust
Windows
- Run `cci capture` once to generate the certificate
- Navigate to `%USERPROFILE%\.mitmproxy\`
- Double-click `mitmproxy-ca-cert.pem`
- Click "Install Certificate"
- Select "Local Machine" → Next
- Select "Place all certificates in the following store"
- Browse → "Trusted Root Certification Authorities"
- Finish
Certificate Help Command
cci config --cert-help
📋 CLI Reference
cci capture
Start the proxy server and capture traffic.
cci capture [OPTIONS]
Options:
-p, --port INTEGER Proxy server port (default: 8080)
-h, --host TEXT Proxy server host (default: 127.0.0.1)
-o, --output PATH Output file path (default: cci_trace.jsonl)
--debug Enable debug mode with verbose logging
-i, --include TEXT Additional URL patterns to include (regex)
-e, --exclude TEXT URL patterns to exclude (regex)
Examples:
# Basic capture
cci capture
# Custom port and output
cci capture --port 9090 --output api_calls.jsonl
# Debug mode with custom filters
cci capture --debug --include ".*my-api\.com.*" --exclude ".*health.*"
cci merge
Merge streaming response chunks into complete records.
cci merge --input <file> --output <file>
Options:
-i, --input PATH Input JSONL file with raw streaming chunks [required]
-o, --output PATH Output JSONL file for merged records [required]
Example:
cci merge --input raw_trace.jsonl --output conversations.jsonl
cci split
Split merged JSONL into individual text files for analysis.
cci split --input <file> --output-dir <directory>
Options:
-i, --input PATH Input merged JSONL file [required]
-o, --output-dir PATH Output directory for split files (default: ./split_output)
Example:
cci split --input merged.jsonl --output-dir ./analysis
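The exact file layout `cci split` produces is an implementation detail; the following Python sketch shows the same idea, writing one text file per merged record (the naming scheme and the fields used are assumptions based on the merged-record format shown below):

```python
import json
from pathlib import Path

def split_merged(input_path, output_dir):
    """Write one .txt file per merged JSONL record (illustrative naming)."""
    out = Path(output_dir)
    out.mkdir(parents=True, exist_ok=True)
    written = []
    with open(input_path, encoding="utf-8") as f:
        for i, line in enumerate(f):
            rec = json.loads(line)
            path = out / f"{i:04d}_{rec.get('request_id', 'unknown')}.txt"
            path.write_text(
                f"URL: {rec.get('url', '')}\n\n{rec.get('response_text', '')}",
                encoding="utf-8",
            )
            written.append(path)
    return written
```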
cci config
Display configuration and setup help.
cci config [OPTIONS]
Options:
--cert-help Show certificate installation instructions
--proxy-help Show proxy configuration instructions
--show Show current configuration
cci stats
Display statistics for a captured trace file.
cci stats <file>
Example:
cci stats my_trace.jsonl
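The exact figures `cci stats` reports are not specified here, but because the trace is plain JSONL you can compute similar aggregates yourself. A minimal sketch (field names follow the record formats in the Output Format section):

```python
import json
from collections import Counter

def trace_stats(path):
    """Count record types and average latency across a CCI trace file."""
    counts = Counter()
    latencies = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            rec = json.loads(line)
            counts[rec.get("type", "unknown")] += 1
            if "total_latency_ms" in rec:
                latencies.append(rec["total_latency_ms"])
            elif "latency_ms" in rec:
                latencies.append(rec["latency_ms"])
    avg = sum(latencies) / len(latencies) if latencies else 0.0
    return dict(counts), avg
```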
📄 Output Format
CCI produces JSONL (JSON Lines) files with the following record types:
Request Record
{
"type": "request",
"id": "uuid-req-001",
"timestamp": "2024-11-25T10:00:00Z",
"method": "POST",
"url": "https://api.anthropic.com/v1/messages",
"headers": {"content-type": "application/json"},
"body": {"model": "claude-3-sonnet", "messages": [...]}
}
Response Chunk (Streaming)
{
"type": "response_chunk",
"request_id": "uuid-req-001",
"timestamp": "2024-11-25T10:00:01Z",
"status_code": 200,
"chunk_index": 0,
"content": {"type": "content_block_delta", "delta": {"text": "Hello"}}
}
Response Meta
{
"type": "response_meta",
"request_id": "uuid-req-001",
"total_latency_ms": 1500,
"status_code": 200,
"total_chunks": 42
}
Non-Streaming Response
{
"type": "response",
"request_id": "uuid-req-001",
"timestamp": "2024-11-25T10:00:01Z",
"status_code": 200,
"headers": {...},
"body": {...},
"latency_ms": 1500
}
Merged Record (after cci merge)
{
"request_id": "uuid-req-001",
"timestamp": "2024-11-25T10:00:00Z",
"method": "POST",
"url": "https://api.anthropic.com/v1/messages",
"request_body": {...},
"response_status": 200,
"response_text": "Hello! How can I help you today?",
"total_latency_ms": 1500,
"chunk_count": 42
}
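A merged record is essentially the concatenation of a request's chunks plus its metadata. The following simplified sketch shows that reduction; it assumes the streamed text lives at `content.delta.text`, as in the Anthropic-style chunk above, and is not CCI's actual merge implementation:

```python
def merge_stream(records):
    """Fold request/response_chunk/response_meta records into one merged record."""
    merged = {}
    chunks = []
    for rec in records:
        if rec["type"] == "request":
            merged.update(
                request_id=rec["id"],
                timestamp=rec["timestamp"],
                method=rec["method"],
                url=rec["url"],
                request_body=rec["body"],
            )
        elif rec["type"] == "response_chunk":
            delta = rec["content"].get("delta", {})
            chunks.append((rec["chunk_index"], delta.get("text", "")))
        elif rec["type"] == "response_meta":
            merged.update(
                response_status=rec["status_code"],
                total_latency_ms=rec["total_latency_ms"],
                chunk_count=rec["total_chunks"],
            )
    # Chunks may arrive out of order; sort by chunk_index before joining.
    merged["response_text"] = "".join(text for _, text in sorted(chunks))
    return merged
```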
⚙️ Configuration
CCI can be configured via TOML/YAML files or environment variables.
Configuration File
Create cci.toml in your current directory or ~/.config/cci/config.toml:
[proxy]
host = "127.0.0.1"
port = 8080
ssl_insecure = false
[filter]
include_patterns = [
".*api\\.anthropic\\.com.*",
".*api\\.openai\\.com.*",
".*generativelanguage\\.googleapis\\.com.*",
]
exclude_patterns = []
[masking]
mask_auth_headers = true
sensitive_headers = ["authorization", "x-api-key", "api-key"]
sensitive_body_fields = []
mask_pattern = "***MASKED***"
[storage]
output_file = "cci_trace.jsonl"
pretty_json = false
max_file_size_mb = 0 # 0 = no rotation
[logging]
level = "INFO"
log_file = "" # optional file path
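The `[masking]` settings amount to a simple substitution over header values before a record is written. A sketch of that behavior, using the defaults above (the actual implementation may differ, e.g. in how body fields are masked):

```python
def mask_headers(headers, sensitive=("authorization", "x-api-key", "api-key"),
                 mask="***MASKED***"):
    """Replace sensitive header values before a record is written to the log."""
    return {
        key: mask if key.lower() in sensitive else value
        for key, value in headers.items()
    }
```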
Environment Variables
CCI_PROXY_HOST=127.0.0.1
CCI_PROXY_PORT=8080
CCI_OUTPUT_FILE=my_trace.jsonl
CCI_LOG_LEVEL=DEBUG
CCI_INCLUDE_PATTERNS=.*my-api\.com.*,.*other-api\.com.*
🔧 Supported LLM Providers
CCI is pre-configured to capture traffic from:
| Provider | API Domain |
|---|---|
| Anthropic | api.anthropic.com |
| OpenAI | api.openai.com |
| Google | generativelanguage.googleapis.com |
| Together | api.together.xyz |
| Groq | api.groq.com |
| Mistral | api.mistral.ai |
| Cohere | api.cohere.ai |
| DeepSeek | api.deepseek.com |
Add custom providers with --include or in the config file.
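Include/exclude filtering is plain regex matching against the request URL. A sketch of that decision (the exclude-beats-include precedence is an assumption, not documented behavior):

```python
import re

def should_capture(url, include_patterns, exclude_patterns=()):
    """Capture a URL if it matches an include pattern and no exclude pattern."""
    if any(re.search(pattern, url) for pattern in exclude_patterns):
        return False
    return any(re.search(pattern, url) for pattern in include_patterns)
```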
🐛 Troubleshooting
SSL Handshake Error
Problem: SSL: CERTIFICATE_VERIFY_FAILED
Solution:
- Ensure the mitmproxy CA certificate is installed
- Run `cci config --cert-help` for instructions
- For testing, some tools support `--insecure` or `verify=False`
Node.js Apps Not Working (Claude Code, Cursor, etc.)
Problem: Requests hang or timeout when using Claude Code or other Node.js-based tools
Solution:
Node.js requires the NODE_EXTRA_CA_CERTS environment variable to trust custom CA certificates:
export NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem
Make sure all three variables are set:
export HTTP_PROXY=http://127.0.0.1:8080
export HTTPS_PROXY=http://127.0.0.1:8080
export NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem
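A quick way to confirm the environment before launching a Node.js tool is to check all three variables at once. A minimal helper (not part of CCI, purely illustrative):

```python
import os

REQUIRED = ("HTTP_PROXY", "HTTPS_PROXY", "NODE_EXTRA_CA_CERTS")

def missing_proxy_vars(env=os.environ):
    """Return the required proxy variables that are not set in env."""
    return [var for var in REQUIRED if not env.get(var)]
```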
Proxy Connection Refused
Problem: Connection refused when connecting through proxy
Solution:
- Ensure CCI is running: `cci capture`
- Check that the port is correct: `--port 8080`
- Check firewall settings
No Traffic Captured
Problem: CCI is running but no requests are logged
Solution:
- Verify the proxy environment variables are set: `echo $HTTP_PROXY $HTTPS_PROXY`
- Check that the URL filter patterns match your API: `cci config --show`
- Add a custom include pattern: `cci capture --include ".*your-api\.com.*"`
High Memory Usage with Long Sessions
Problem: Memory grows during long capture sessions
Solution:
Configure log rotation in cci.toml:
[storage]
max_file_size_mb = 100
📜 License
MIT License
🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
📞 Support
- GitHub Issues: Report a bug
- Documentation: Read the docs
File details
Details for the file claude_code_inspector-1.1.0.tar.gz.
File metadata
- Download URL: claude_code_inspector-1.1.0.tar.gz
- Size: 31.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 6fa3dd0a2d9f71226f646e08eb929f2a5045ee8f8b6d79b0872c912c87ce1c60 |
| MD5 | 025458679d633adefff1d2c7898a5ebf |
| BLAKE2b-256 | f3ca97cb287e2030c571137444052a7c9a84b0411c6c395b698cea2077ff8735 |
Provenance
The following attestation bundle was made for claude_code_inspector-1.1.0.tar.gz:
Publisher: publish.yml on chouzz/claude-code-inspector
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: claude_code_inspector-1.1.0.tar.gz
- Subject digest: 6fa3dd0a2d9f71226f646e08eb929f2a5045ee8f8b6d79b0872c912c87ce1c60
- Sigstore transparency entry: 731027057
- Permalink: chouzz/claude-code-inspector@bea89b69c5810a61ce33f12fdf3613c0ae8b6ee6
- Branch / Tag: refs/tags/v1.1.0
- Owner: https://github.com/chouzz
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@bea89b69c5810a61ce33f12fdf3613c0ae8b6ee6
- Trigger Event: release
File details
Details for the file claude_code_inspector-1.1.0-py3-none-any.whl.
File metadata
- Download URL: claude_code_inspector-1.1.0-py3-none-any.whl
- Size: 28.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 8b24848e9cca90125b4f74ae853b4545b5b0d99f4a27c4973ef33f787159564e |
| MD5 | f6e222ed86f9bca8935ffcfaf14ab598 |
| BLAKE2b-256 | 54b240b5cdb1541f02f6f234017ca865ea05716257266069a13fe36c84306d97 |
Provenance
The following attestation bundle was made for claude_code_inspector-1.1.0-py3-none-any.whl:
Publisher: publish.yml on chouzz/claude-code-inspector
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: claude_code_inspector-1.1.0-py3-none-any.whl
- Subject digest: 8b24848e9cca90125b4f74ae853b4545b5b0d99f4a27c4973ef33f787159564e
- Sigstore transparency entry: 731027058
- Permalink: chouzz/claude-code-inspector@bea89b69c5810a61ce33f12fdf3613c0ae8b6ee6
- Branch / Tag: refs/tags/v1.1.0
- Owner: https://github.com/chouzz
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@bea89b69c5810a61ce33f12fdf3613c0ae8b6ee6
- Trigger Event: release