
Intercept and analyze LLM traffic from AI coding tools


LLM Interceptor (LLI)

๐Ÿ” Proxy-layer microscope for LLM traffic analysis

A cross-platform command-line tool that intercepts, analyzes, and logs communications between AI coding tools/agents (Claude Code, Cursor, Codex, Gemini-CLI, etc.) and their backend LLM APIs.


[Screenshot: CCI Web UI]

✨ Features

  • Watch Mode - Interactive continuous capture with session management
  • Transparent Inspection - See exactly what prompts are sent and what responses are received
  • Streaming Support - Captures both streaming (SSE) and non-streaming API responses
  • Multi-Provider - Works with Anthropic, OpenAI, Google, Groq, Together, Mistral, and more
  • Automatic Masking - Protects API keys and sensitive data in logs
  • Auto Processing - Automatically merges and splits session data
  • Cross-Platform - Works on Windows, macOS, and Linux
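
The Automatic Masking feature redacts secrets before they reach the log files. As a rough illustration (the header names and key pattern below are assumptions for the sketch, not LLI's actual rules), masking can look like this:

```python
import re

# Header names and the key regex are illustrative assumptions,
# not LLI's actual implementation.
SENSITIVE_HEADERS = {"authorization", "x-api-key", "x-goog-api-key"}

def mask_headers(headers: dict) -> dict:
    """Replace sensitive header values with a redaction marker."""
    return {
        name: "***MASKED***" if name.lower() in SENSITIVE_HEADERS else value
        for name, value in headers.items()
    }

def mask_body(text: str) -> str:
    """Redact anything that looks like an sk-... style API key."""
    return re.sub(r"sk-[A-Za-z0-9_-]{16,}", "***MASKED***", text)
```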

📦 Installation

Using pip

pip install llm-interceptor

Using uv (recommended)

uv add llm-interceptor

From source

git clone https://github.com/chouzz/llm-interceptor.git
cd llm-interceptor
uv sync

Note: This project was formerly named claude-code-inspector. The new canonical name is llm-interceptor.

🚀 Quick Start

1. Install Certificate (For HTTPS Capture Only)

If you're only capturing HTTP traffic, you can skip this step. Only install the certificate if you need to capture HTTPS requests.

# Start lli briefly so mitmproxy generates its CA certificate
lli watch &
sleep 2
kill %1

Then install the certificate:

macOS:

open ~/.mitmproxy/mitmproxy-ca-cert.pem
# Double-click to add to Keychain
# In Keychain Access, find "mitmproxy" → Double-click → Trust → "Always Trust"

Linux (Ubuntu/Debian):

sudo cp ~/.mitmproxy/mitmproxy-ca-cert.pem /usr/local/share/ca-certificates/mitmproxy.crt
sudo update-ca-certificates

Windows: Navigate to %USERPROFILE%\.mitmproxy\, double-click mitmproxy-ca-cert.pem → Install Certificate → Local Machine → Trusted Root Certification Authorities

2. Start Watch Mode and Record Sessions

lli watch

In watch mode:

  • Press Enter to start recording a session
  • Press Enter again to stop recording and automatically process the session
  • Press Esc while recording to cancel the current session (no output generated)
  • Ctrl+C to exit watch mode

3. Configure Your Application and Start Dialogue (New Terminal)

export HTTP_PROXY=http://127.0.0.1:9090
export HTTPS_PROXY=http://127.0.0.1:9090
export NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem

# Run Claude and start your conversation
claude
# Now start your dialogue - all prompts and responses will be captured
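
The exports above work because most HTTP clients, including Python's urllib, honor the standard proxy environment variables. A minimal sketch, assuming the default port 9090:

```python
import os
import urllib.request

# Mirror the exports above; 9090 is LLI's default proxy port.
os.environ["HTTP_PROXY"] = "http://127.0.0.1:9090"
os.environ["HTTPS_PROXY"] = "http://127.0.0.1:9090"

# Any client that reads these variables is routed through the
# interceptor; urllib picks them up automatically.
proxies = urllib.request.getproxies()
print(proxies["http"])   # http://127.0.0.1:9090
print(proxies["https"])  # http://127.0.0.1:9090
```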

4. Visualize with Web UI

Launch the web interface at http://127.0.0.1:8000 to analyze captured conversations:

In the UI, you can:

  • Browse captured sessions in the sidebar
  • View conversation flow between requests and responses
  • Inspect detailed API payloads and metadata
  • Search and filter through captured data
  • Copy formatted content for further analysis

🎬 How Watch Mode Works

Watch mode uses a state machine with three states:

State        Description
IDLE         Monitoring traffic, waiting for you to start a session
RECORDING    Capturing traffic with session ID injection
PROCESSING   Auto-extracting, merging, and splitting session data
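
The three states and key bindings above can be modeled as a small state machine. This is an illustrative sketch, not LLI's actual code:

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    RECORDING = auto()
    PROCESSING = auto()

class WatchMode:
    """Toy model of watch mode's state transitions."""

    def __init__(self):
        self.state = State.IDLE
        self.session = 0

    def press_enter(self):
        if self.state is State.IDLE:
            self.session += 1
            self.state = State.RECORDING       # start a new session
        elif self.state is State.RECORDING:
            self.state = State.PROCESSING      # stop and process
            self.finish_processing()

    def press_esc(self):
        if self.state is State.RECORDING:
            self.state = State.IDLE            # cancel: no output generated

    def finish_processing(self):
        self.state = State.IDLE                # back to monitoring
```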

Example Session

$ lli watch

╭─────────────────────────╮
│   LLI Watch Mode        │
│ Continuous Capture      │
╰─────────────────────────╯

  Proxy Port:    9090
  Output Dir:    ./traces
  Global Log:    traces/all_captured_20251203_220000.jsonl

Configure your application:
  export HTTP_PROXY=http://127.0.0.1:9090
  export HTTPS_PROXY=http://127.0.0.1:9090
  export NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem

โ— [IDLE] Monitoring on :9090... Logging to all_captured_20251203_220000.jsonl
  Press [Enter] to START Session 1
<Enter>

◉ [REC] Session 01_session_20251203_223010 is recording...
  Press [Enter] to STOP & PROCESS, [Esc] to CANCEL
<Enter>

โณ [BUSY] Processing Session 01_session_20251203_223010...
  โœ” Saved to traces/01_session_20251203_223010/

โ— [IDLE] Monitoring on :9090... Logging to all_captured_20251203_220000.jsonl
  Press [Enter] to START Session 2

Output Structure

./traces/                                    # Root output directory
├── all_captured_20251203_220000.jsonl       # Global log (all traffic)
│
├── 01_session_20251203_223010/              # Session 01 folder
│   ├── raw.jsonl                            # Clean session data
│   ├── merged.jsonl                         # Merged conversations
│   └── split_output/                        # Individual files
│       ├── 001_request_2025-12-03_22-30-10.json
│       └── 001_response_2025-12-03_22-30-10.json
│
├── 02_session_20251203_224500/              # Session 02 folder
└── ...
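
Given this layout, downstream tooling can walk the session folders directly. A sketch that reads each session's raw.jsonl (the JSONL schema itself is whatever LLI writes, so events are returned as plain dicts):

```python
import json
from pathlib import Path

def iter_session_events(traces_dir: str):
    """Yield (session_name, event) pairs from every session's raw.jsonl.

    Follows the directory structure above; events are opaque dicts
    since the record schema is LLI's internal format.
    """
    for raw in sorted(Path(traces_dir).glob("*_session_*/raw.jsonl")):
        with raw.open() as f:
            for line in f:
                if line.strip():
                    yield raw.parent.name, json.loads(line)
```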

📋 CLI Reference

lli watch

Start watch mode for continuous session capture (recommended).

lli watch [OPTIONS]

Options:
  -p, --port INTEGER       Proxy server port (default: 9090)
  -o, --output-dir PATH    Root output directory (default: ./traces)
  -i, --include TEXT       Additional URL patterns to include (glob pattern)
  --debug                  Enable debug mode with verbose logging

Examples:

# Basic watch mode
lli watch

# Custom port and output directory
lli watch --port 8888 --output-dir ./my_traces

# Include custom API endpoint (glob pattern)
lli watch --include "*my-custom-api.com*"

# Match all subdomains of a domain
lli watch --include "*api.example.com*"

Glob Pattern Syntax:

Pattern   Description
*         Matches any characters
?         Matches a single character
[seq]     Matches any character in seq
[!seq]    Matches any character not in seq
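
These patterns behave like shell-style globs. Assuming LLI's matching follows Python's fnmatch semantics, the earlier examples work out as:

```python
from fnmatch import fnmatch

# A leading/trailing * makes the pattern match anywhere in the URL.
print(fnmatch("https://my-custom-api.com/v1/chat", "*my-custom-api.com*"))  # True
print(fnmatch("https://eu.api.example.com/v1", "*api.example.com*"))        # True
print(fnmatch("https://api.example.org/v1", "*api.example.com*"))           # False
```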

lli config

Display configuration and setup help.

lli config --cert-help    # Certificate installation instructions
lli config --proxy-help   # Proxy configuration instructions
lli config --show         # Show current configuration

lli stats

Display statistics for a captured trace file.

lli stats traces/01_session_xxx/raw.jsonl
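
For custom statistics beyond what lli stats prints, the JSONL traces are easy to post-process. A sketch that counts events per host (the "host" field name is an assumption for illustration; check your own raw.jsonl for the actual schema):

```python
import json
from collections import Counter

def trace_stats(path: str) -> Counter:
    """Count events per host in a JSONL trace file.

    The "host" key is an assumed field name, not a documented
    part of LLI's record format.
    """
    counts: Counter = Counter()
    with open(path) as f:
        for line in f:
            if line.strip():
                event = json.loads(line)
                counts[event.get("host", "unknown")] += 1
    return counts
```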

🔧 Supported LLM Providers

LLI is pre-configured to capture traffic from:

Provider    API Domain
Anthropic   api.anthropic.com
OpenAI      api.openai.com
Google      generativelanguage.googleapis.com
Together    api.together.xyz
Groq        api.groq.com
Mistral     api.mistral.ai
Cohere      api.cohere.ai
DeepSeek    api.deepseek.com

Add custom providers with --include (using glob patterns):

lli watch --include "*my-custom-api.com*"

๐Ÿ› Troubleshooting

SSL Certificate Error

Problem: SSL: CERTIFICATE_VERIFY_FAILED

Solution: Install the mitmproxy CA certificate. Run lli config --cert-help for instructions.

Node.js Apps Not Working

Problem: Requests hang or timeout when using Claude Code, Cursor, etc.

Solution: Set the NODE_EXTRA_CA_CERTS environment variable:

export NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem

No Traffic Captured

Problem: Watch mode is running but no requests are logged

Solution:

  1. Verify proxy environment variables are set correctly
  2. Make sure the URL matches the default patterns (or add --include)
  3. Check lli config --show to see current filter patterns
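
A quick way to rule out step 1 is to check the variables programmatically. Minimal sketch:

```python
import os

# The three variables from the Quick Start proxy configuration.
REQUIRED = ("HTTP_PROXY", "HTTPS_PROXY", "NODE_EXTRA_CA_CERTS")

def missing_proxy_vars(env=os.environ):
    """Return which required proxy variables are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]
```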

📜 License

MIT License

๐Ÿค Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

📞 Support



Download files

Download the file for your platform.

Source Distribution

llm_interceptor-2.1.0.tar.gz (6.2 MB)


Built Distribution


llm_interceptor-2.1.0-py3-none-any.whl (42.8 kB)


File details

Details for the file llm_interceptor-2.1.0.tar.gz.

File metadata

  • Download URL: llm_interceptor-2.1.0.tar.gz
  • Size: 6.2 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for llm_interceptor-2.1.0.tar.gz
Algorithm    Hash digest
SHA256       227581d2b1e477b8be42784719b717b59cde5de19f4d189fe7a1e0ca62b99d92
MD5          c20b8d8e3b875118e5e794618e306c71
BLAKE2b-256  a7e2423a958d0f99204c4f2f5faba0e6144d3d61d82716a7b0e83248cf4ed2dc

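To verify a downloaded file against the digests above, compute its SHA-256 locally, for example:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hex SHA-256 digest of a file, for comparison with the
    published hash, computed in chunks to handle large files."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()
```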

Provenance

The following attestation bundles were made for llm_interceptor-2.1.0.tar.gz:

Publisher: publish.yml on chouzz/llm-interceptor

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file llm_interceptor-2.1.0-py3-none-any.whl.

File hashes

Hashes for llm_interceptor-2.1.0-py3-none-any.whl
Algorithm    Hash digest
SHA256       6c117631f2cd6a717c124d3f14e006ff64a33dc031ac1a19d0011ebd9df02b0c
MD5          bb57a935cd5290777e2198b1cf1efce7
BLAKE2b-256  5c1e6b2d486f5440c5f235144875c9f14cd98b56973124a9f25c6cd1d38df576


Provenance

The following attestation bundles were made for llm_interceptor-2.1.0-py3-none-any.whl:

Publisher: publish.yml on chouzz/llm-interceptor

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
