# LLM Interceptor (LLI)

> 🔬 Proxy-layer microscope: intercept and analyze LLM traffic from AI coding tools
A cross-platform command-line tool that intercepts, analyzes, and logs communications between AI coding tools/agents (Claude Code, Cursor, Codex, Gemini-CLI, etc.) and their backend LLM APIs.
## ✨ Features
- **Watch Mode** - Interactive continuous capture with session management
- **Transparent Inspection** - See exactly what prompts are sent and what responses are received
- **Streaming Support** - Captures both streaming (SSE) and non-streaming API responses
- **Multi-Provider** - Works with Anthropic, OpenAI, Google, Groq, Together, Mistral, and more
- **Automatic Masking** - Protects API keys and sensitive data in logs
- **Auto Processing** - Automatically merges and splits session data
- **Cross-Platform** - Works on Windows, macOS, and Linux
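
The masking feature conceals secrets before they reach the logs. LLI's actual masking rules are internal to the tool; as a rough illustration only, a hypothetical masker might redact bearer tokens and `sk-`-style API keys with regular expressions:

```python
import re

# Hypothetical patterns -- LLI's real masking rules may differ
SECRET_PATTERNS = [
    re.compile(r"sk-[A-Za-z0-9_-]{10,}"),           # OpenAI/Anthropic-style keys
    re.compile(r"(?i)(bearer\s+)[A-Za-z0-9._-]+"),  # Authorization header values
]

def mask_secrets(text: str) -> str:
    """Replace anything that looks like a credential with a placeholder."""
    for pattern in SECRET_PATTERNS:
        text = pattern.sub(
            lambda m: (m.group(1) if m.lastindex else "") + "***MASKED***", text
        )
    return text

print(mask_secrets("Authorization: Bearer abc123.def456"))
# Authorization: Bearer ***MASKED***
print(mask_secrets('{"api_key": "sk-ant-0123456789abcdef"}'))
# {"api_key": "***MASKED***"}
```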
## 📦 Installation

### Using pip

```bash
pip install llm-interceptor
```

### Using uv (recommended)

```bash
uv add llm-interceptor
```

### From source

```bash
git clone https://github.com/chouzz/llm-interceptor.git
cd llm-interceptor
uv sync
```

> **Note:** This project was formerly named `claude-code-inspector`. The new canonical name is `llm-interceptor`.
## 🚀 Quick Start
### 1. Install the Certificate (HTTPS Capture Only)

The CA certificate is only needed to decrypt HTTPS traffic; if you are capturing plain HTTP, you can skip this step.

```bash
# Run the proxy briefly so mitmproxy generates its CA certificate
lli watch &
sleep 2
kill %1
```
Then install the certificate:
**macOS:**

```bash
open ~/.mitmproxy/mitmproxy-ca-cert.pem
# Double-click to add to Keychain
# In Keychain Access, find "mitmproxy" → Double-click → Trust → "Always Trust"
```
**Linux (Ubuntu/Debian):**

```bash
sudo cp ~/.mitmproxy/mitmproxy-ca-cert.pem /usr/local/share/ca-certificates/mitmproxy.crt
sudo update-ca-certificates
```
**Windows:**

Navigate to `%USERPROFILE%\.mitmproxy\`, double-click `mitmproxy-ca-cert.pem` → Install Certificate → Local Machine → Trusted Root Certification Authorities.
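
Before installing, you can confirm the certificate was actually generated. A small convenience sketch (not part of LLI) that checks the default mitmproxy path:

```python
from pathlib import Path

# Default location where mitmproxy writes its CA certificate
cert = Path.home() / ".mitmproxy" / "mitmproxy-ca-cert.pem"

if cert.exists():
    print(f"certificate found: {cert}")
else:
    print("certificate not found -- run `lli watch` once to generate it")
```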
### 2. Start Watch Mode and Record Sessions

```bash
lli watch
```
In watch mode:
- Press **Enter** to start recording a session
- Press **Enter** again to stop recording and automatically process the session
- Press **Esc** while recording to cancel the current session (no output generated)
- Press **Ctrl+C** to exit watch mode
### 3. Configure Your Application and Start a Dialogue (New Terminal)

```bash
export HTTP_PROXY=http://127.0.0.1:9090
export HTTPS_PROXY=http://127.0.0.1:9090
export NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem

# Run Claude and start your conversation
claude
# All prompts and responses are now captured
```
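
For tools launched from a script rather than an interactive shell, the same three variables can be injected into the child process's environment. A minimal Python sketch (port 9090 is LLI's default; the child command here is just a placeholder):

```python
import os
import subprocess
import sys

# Build a child environment that routes traffic through the LLI proxy
env = os.environ.copy()
env["HTTP_PROXY"] = "http://127.0.0.1:9090"
env["HTTPS_PROXY"] = "http://127.0.0.1:9090"
env["NODE_EXTRA_CA_CERTS"] = os.path.expanduser("~/.mitmproxy/mitmproxy-ca-cert.pem")

# Launch any tool with the proxied environment (placeholder command shown)
subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ['HTTP_PROXY'])"],
    env=env,
)
```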
### 4. Visualize with the Web UI

Open the web UI at http://127.0.0.1:8000 to analyze captured conversations:
In the UI, you can:
- Browse captured sessions in the sidebar
- View conversation flow between requests and responses
- Inspect detailed API payloads and metadata
- Search and filter through captured data
- Copy formatted content for further analysis
## 🔬 How Watch Mode Works
Watch mode uses a state machine with three states:
| State | Description |
|---|---|
| IDLE | Monitoring traffic, waiting for you to start a session |
| RECORDING | Capturing traffic with session ID injection |
| PROCESSING | Auto-extracting, merging, and splitting session data |
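
The cycle can be modeled as a tiny state machine. The sketch below is illustrative only: the state names come from the table above, but the transition logic is an assumption, not LLI's actual code.

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    RECORDING = auto()
    PROCESSING = auto()

# Assumed transitions: Enter starts/stops a session, Esc cancels,
# and processing returns to idle when it finishes.
TRANSITIONS = {
    (State.IDLE, "enter"): State.RECORDING,
    (State.RECORDING, "enter"): State.PROCESSING,
    (State.RECORDING, "esc"): State.IDLE,       # cancel: no output generated
    (State.PROCESSING, "done"): State.IDLE,
}

def step(state: State, event: str) -> State:
    """Advance the machine; unrecognized events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

s = State.IDLE
for event in ["enter", "enter", "done"]:  # record one full session
    s = step(s, event)
print(s)  # State.IDLE
```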
### Example Session

```text
$ lli watch
╭──────────────────────────╮
│      LLI Watch Mode      │
│    Continuous Capture    │
╰──────────────────────────╯
Proxy Port: 9090
Output Dir: ./traces (or OS-specific logs directory)
Global Log: traces/all_captured_20251203_220000.jsonl

Configure your application:
  export HTTP_PROXY=http://127.0.0.1:9090
  export HTTPS_PROXY=http://127.0.0.1:9090
  export NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem

● [IDLE] Monitoring on :9090... Logging to all_captured_20251203_220000.jsonl
  Press [Enter] to START Session 1
<Enter>
⏺ [REC] Session 01_session_20251203_223010 is recording...
  Press [Enter] to STOP & PROCESS, [Esc] to CANCEL
<Enter>
⏳ [BUSY] Processing Session 01_session_20251203_223010...
✓ Saved to traces/01_session_20251203_223010/
● [IDLE] Monitoring on :9090... Logging to all_captured_20251203_220000.jsonl
  Press [Enter] to START Session 2
```
### Output Structure

```text
./traces/                                # Root output directory
├── all_captured_20251203_220000.jsonl   # Global log (all traffic)
│
├── 01_session_20251203_223010/          # Session 01 folder
│   ├── raw.jsonl                        # Clean session data
│   ├── merged.jsonl                     # Merged conversations
│   └── split_output/                    # Individual files
│       ├── 001_request_2025-12-03_22-30-10.json
│       └── 001_response_2025-12-03_22-30-10.json
│
└── 02_session_20251203_224500/          # Session 02 folder
    └── ...
```
## 📖 CLI Reference

### `lli watch`

Start watch mode for continuous session capture (recommended).

```text
lli watch [OPTIONS]

Options:
  -p, --port INTEGER                Proxy server port (default: 9090)
  -o, --output-dir, --log-dir PATH  Root output directory (default: ./traces or OS log dir)
  -i, --include TEXT                Additional URL patterns to include (glob pattern)
  --debug                           Enable debug mode with verbose logging
```
**Examples:**

```bash
# Basic watch mode
lli watch

# Custom port and output directory
lli watch --port 8888 --output-dir ./my_traces

# Include custom API endpoint (glob pattern)
lli watch --include "*my-custom-api.com*"

# Match all subdomains of a domain
lli watch --include "*api.example.com*"
```
**Glob Pattern Syntax:**

| Pattern | Description |
|---|---|
| `*` | Matches any characters |
| `?` | Matches a single character |
| `[seq]` | Matches any character in seq |
| `[!seq]` | Matches any character not in seq |
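
The table follows Python's `fnmatch` glob semantics, so you can sanity-check a pattern locally before passing it to `--include`:

```python
from fnmatch import fnmatch

# Patterns from the table, checked against candidate URLs/hosts
print(fnmatch("https://my-custom-api.com/v1/chat", "*my-custom-api.com*"))  # True
print(fnmatch("eu.api.example.com", "*api.example.com*"))                   # True
print(fnmatch("api.example.org", "*api.example.com*"))                      # False
print(fnmatch("v1", "v?"))                                                  # True
```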
### `lli config`

Display configuration and setup help.

```bash
lli config --cert-help   # Certificate installation instructions
lli config --proxy-help  # Proxy configuration instructions
lli config --show        # Show current configuration
```
### `lli stats`

Display statistics for a captured trace file.

```bash
lli stats traces/01_session_xxx/raw.jsonl
```
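
For custom analysis beyond `lli stats`, the trace files are plain JSONL: one JSON object per line. A sketch of tallying records yourself (the `type` field and its values here are assumptions about the record format, not a documented schema):

```python
import json
from collections import Counter
from io import StringIO

# Stand-in for open("traces/01_session_xxx/raw.jsonl") -- hypothetical records
sample = StringIO(
    '{"type": "request", "url": "https://api.anthropic.com/v1/messages"}\n'
    '{"type": "response", "status": 200}\n'
    '{"type": "request", "url": "https://api.openai.com/v1/chat/completions"}\n'
)

# Count records per assumed "type" field
counts = Counter(json.loads(line)["type"] for line in sample if line.strip())
print(counts)  # Counter({'request': 2, 'response': 1})
```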
## 🔧 Supported LLM Providers
LLI is pre-configured to capture traffic from:
| Provider | API Domain |
|---|---|
| Anthropic | api.anthropic.com |
| OpenAI | api.openai.com |
| Google | generativelanguage.googleapis.com |
| Together | api.together.xyz |
| Groq | api.groq.com |
| Mistral | api.mistral.ai |
| Cohere | api.cohere.ai |
| DeepSeek | api.deepseek.com |
Add custom providers with `--include` (using glob patterns):

```bash
lli watch --include "*my-custom-api.com*"
```
## 🐛 Troubleshooting
### SSL Certificate Error

**Problem:** `SSL: CERTIFICATE_VERIFY_FAILED`

**Solution:** Install the mitmproxy CA certificate. Run `lli config --cert-help` for instructions.
### Node.js Apps Not Working

**Problem:** Requests hang or time out when using Claude Code, Cursor, etc.

**Solution:** Set the `NODE_EXTRA_CA_CERTS` environment variable:

```bash
export NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem
```
### No Traffic Captured

**Problem:** Watch mode is running but no requests are logged

**Solution:**
- Verify the proxy environment variables are set correctly
- Make sure the URL matches the default patterns (or add `--include`)
- Check `lli config --show` to see the current filter patterns
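
A quick way to confirm what a captured application actually inherits is to print the relevant variables from the same environment (a debugging sketch, not an LLI command):

```python
import os

# Print the proxy-related variables a child process would inherit
for var in ("HTTP_PROXY", "HTTPS_PROXY", "NODE_EXTRA_CA_CERTS"):
    print(f"{var}={os.environ.get(var, '(not set)')}")
```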
## 📄 License
MIT License
## 🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
## 💬 Support
- GitHub Issues: Report a bug
- Documentation: Read the docs