Intercept and analyze LLM traffic from AI coding tools

LLM Interceptor (LLI)

🔍 Proxy-layer microscope for LLM traffic analysis

A cross-platform command-line tool that intercepts, analyzes, and logs communications between AI coding tools/agents (Claude Code, Cursor, Codex, OpenCode, etc.) and their backend LLM APIs.


[Screenshot: LLI Web UI]

✨ Features

  • Watch Mode - Interactive continuous capture with session management
  • Transparent Inspection - See exactly what prompts are sent and what responses are received
  • Streaming Support - Captures both streaming (SSE) and non-streaming API responses
  • Multi-Provider - Works with Anthropic, OpenAI, Google, Groq, Together, Mistral, and more
  • Automatic Masking - Protects API keys and sensitive data in logs
  • Auto Processing - Automatically merges and splits session data
  • Cross-Platform - Works on Windows, macOS, and Linux

📦 Installation

Using uv (recommended)

uv tool install llm-interceptor

Using pip

pip install llm-interceptor

From source

git clone https://github.com/chouzz/llm-interceptor.git
cd llm-interceptor
uv sync --dev
uv run lli-dev-setup

Development setup

Git does not copy hooks from .git/hooks when you clone a repository, so each new clone must install the project's pre-commit hook once:

uv sync --dev
uv run lli-dev-setup

If you prefer pip:

pip install -e .[dev]
lli-dev-setup

🚀 Quick Start

1. Install Certificate (For HTTPS Capture Only)

If you're only capturing HTTP traffic, you can skip this step. Only install the certificate if you need to capture HTTPS requests.

# Generate certificate
lli watch &
sleep 2
kill %1

Then install the certificate:

macOS:

open ~/.mitmproxy/mitmproxy-ca-cert.pem
# Double-click to add to Keychain
# In Keychain Access, find "mitmproxy" → Double-click → Trust → "Always Trust"

Linux (Ubuntu/Debian):

sudo cp ~/.mitmproxy/mitmproxy-ca-cert.pem /usr/local/share/ca-certificates/mitmproxy.crt
sudo update-ca-certificates

Windows: Navigate to %USERPROFILE%\.mitmproxy\. Double-click mitmproxy-ca-cert.p12 (or mitmproxy-ca-cert.cer) to open the certificate import wizard → Install Certificate → Local Machine → place in Trusted Root Certification Authorities → Finish.

2. Start Watch Mode and Record Sessions

lli watch

If you need to capture traffic to a custom or self-hosted API, use --include with a glob pattern, for example:

lli watch --include "*api.example.com*"

In watch mode:

  • Press Enter to start recording a session
  • Press Enter again to stop recording and automatically process the session
  • Press Esc while recording to cancel the current session (no output generated)
  • Ctrl+C to exit watch mode

3. Configure Your Application and Start Dialogue (New Terminal)

export HTTP_PROXY=http://127.0.0.1:9090
export HTTPS_PROXY=http://127.0.0.1:9090
export NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem
# Optional: bypass proxy for some hosts (e.g. localhost). Configure in lli.toml as no_proxy or run: lli config --proxy-help

# Run Claude and start your conversation
claude
# Now start your dialogue - all prompts and responses will be captured
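Before launching a full agent session, you can sanity-check that the proxy is reachable by routing a request through it from Python. This is a sketch only; the host and port mirror the environment variables above, and the commented request URL is just an example:

```python
import urllib.request

# Route requests through the local LLI proxy (default port 9090),
# mirroring the HTTP_PROXY / HTTPS_PROXY variables set above.
proxy = urllib.request.ProxyHandler({
    "http": "http://127.0.0.1:9090",
    "https": "http://127.0.0.1:9090",
})
opener = urllib.request.build_opener(proxy)

# Any request made with `opener` would now appear in the capture log, e.g.:
# opener.open("https://api.anthropic.com/v1/messages")
```

For HTTPS targets you would also need the mitmproxy CA from step 1 in your trust store, otherwise certificate verification fails.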

4. (Optional) Corporate network: upstream CA certificate

If you capture traffic behind a corporate proxy or on a network where upstream servers use a company-signed certificate, you may see TLS errors because the proxy only trusts the system CAs, not your company's CA. Configure the upstream trust CA so LLI (mitmproxy) can verify connections to the corporate proxy or target hosts:

  • Client → LLI: Your app must trust the mitmproxy CA (install ~/.mitmproxy/mitmproxy-ca-cert.pem as in step 1).
  • LLI → upstream (corporate proxy / target): LLI must trust the company CA. Set the path to your company's root or intermediate CA (PEM file):
# Option A: CLI
lli watch --upstream-ca-cert /path/to/corporate-ca.pem

Use lli config --show to confirm the upstream CA path. If the file does not exist at startup, LLI will exit with an error.

5. Visualize with Web UI

The web UI is served at http://127.0.0.1:8000; open it in your browser to analyze captured conversations.

In the UI, you can:

  • Browse captured sessions in the sidebar
  • View conversation flow between requests and responses
  • Inspect detailed API payloads and metadata
  • Search and filter through captured data
  • Copy formatted content for further analysis

🎬 How Watch Mode Works

Watch mode uses a state machine with three states:

State        Description
IDLE         Monitoring traffic, waiting for you to start a session
RECORDING    Capturing traffic with session ID injection
PROCESSING   Auto-extracting, merging, and splitting session data
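The transitions can be sketched as a tiny state machine. This is a simplified illustration of the loop described above, not the tool's actual implementation; the real watch mode also handles Esc-to-cancel and runs processing asynchronously:

```python
from enum import Enum

class State(Enum):
    IDLE = "idle"
    RECORDING = "recording"
    PROCESSING = "processing"

def on_enter(state: State) -> State:
    """Pressing Enter starts a session from IDLE, or stops & processes one."""
    if state is State.IDLE:
        return State.RECORDING
    if state is State.RECORDING:
        return State.PROCESSING
    return state  # Enter is ignored while processing

def on_done(state: State) -> State:
    """When processing finishes, watch mode returns to IDLE."""
    return State.IDLE if state is State.PROCESSING else state
```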

Example Session

$ lli watch

╭─────────────────────────╮
│   LLI Watch Mode        │
│  Continuous Capture     │
╰─────────────────────────╯

  Proxy Port:    9090
  Output Dir:    ./traces (or OS-specific logs directory)
  Global Log:    traces/all_captured_20251203_220000.jsonl

Configure your application:
  export HTTP_PROXY=http://127.0.0.1:9090
  export HTTPS_PROXY=http://127.0.0.1:9090
  export NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem

โ— [IDLE] Monitoring on :9090... Logging to all_captured_20251203_220000.jsonl
  Press [Enter] to START Session 1
<Enter>

◉ [REC] Session 01_session_20251203_223010 is recording...
  Press [Enter] to STOP & PROCESS, [Esc] to CANCEL
<Enter>

โณ [BUSY] Processing Session 01_session_20251203_223010...
  โœ” Saved to traces/01_session_20251203_223010/

โ— [IDLE] Monitoring on :9090... Logging to all_captured_20251203_220000.jsonl
  Press [Enter] to START Session 2

Output Structure

./traces/                                    # Root output directory
├── all_captured_20251203_220000.jsonl       # Global log (all traffic)
│
├── 01_session_20251203_223010/              # Session 01 folder
│   ├── raw.jsonl                            # Clean session data
│   ├── merged.jsonl                         # Merged conversations
│   └── split_output/                        # Individual files
│       ├── 001_request_2025-12-03_22-30-10.json
│       └── 001_response_2025-12-03_22-30-10.json
│
├── 02_session_20251203_224500/              # Session 02 folder
└── ...
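Because every trace file is JSONL (one JSON object per line), session data is easy to post-process with a few lines of Python. The sketch below tallies records by a `type` field; that field name is an assumption for illustration, so adjust the key to whatever your captures actually contain:

```python
import json
from pathlib import Path

def summarize(path: str) -> dict:
    """Count records in a JSONL trace by their (assumed) 'type' field."""
    counts: dict = {}
    for line in Path(path).read_text().splitlines():
        if not line.strip():
            continue
        record = json.loads(line)              # one JSON object per line
        kind = record.get("type", "unknown")   # hypothetical field name
        counts[kind] = counts.get(kind, 0) + 1
    return counts
```

Running it over a session's raw.jsonl gives a quick per-type record count, similar in spirit to what lli stats reports.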

📋 CLI Reference

lli watch

Start watch mode for continuous session capture (recommended).

lli watch [OPTIONS]

Options:
  -p, --port INTEGER           Proxy server port (default: 9090)
  -o, --output-dir, --log-dir PATH  Root output directory (default: ./traces or OS log dir)
  -i, --include TEXT           Additional URL patterns to include (glob pattern)
  --upstream-ca-cert PATH      Path to PEM or CA bundle for trusting upstream (e.g. corporate proxy) certificates
  --debug                      Enable debug mode with verbose logging

Examples:

# Basic watch mode
lli watch

# Custom port and output directory
lli watch --port 8888 --output-dir ./my_traces

# Include custom API endpoint (glob pattern)
lli watch --include "*my-custom-api.com*"

# Corporate network: trust company CA so upstream TLS (proxy/target) is verified
lli watch --upstream-ca-cert /path/to/corporate-ca.pem

# Match all subdomains of a domain
lli watch --include "*api.example.com*"

Glob Pattern Syntax:

Pattern   Description
*         Matches any characters
?         Matches a single character
[seq]     Matches any character in seq
[!seq]    Matches any character not in seq
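These are the same glob semantics as Python's fnmatch module, so you can sanity-check a pattern before passing it to --include. Whether LLI matches with fnmatch internally is an assumption; the URLs below are made up for illustration:

```python
from fnmatch import fnmatch

# Same *, ?, [seq], [!seq] semantics as the table above.
pattern = "*api.example.com*"
urls = [
    "https://api.example.com/v1/messages",     # matches
    "https://eu.api.example.com/v1/messages",  # matches (subdomain)
    "https://api.other.com/v1/chat",           # does not match
]
matched = [u for u in urls if fnmatch(u, pattern)]
```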

lli config

Display configuration and setup help.

lli config --cert-help    # Certificate installation instructions
lli config --proxy-help   # Proxy configuration instructions
lli config --show         # Show current configuration

lli stats

Display statistics for a captured trace file.

lli stats traces/01_session_xxx/raw.jsonl

🔧 Supported LLM Providers

LLI is pre-configured to capture traffic from:

Provider    API Domain
Anthropic   api.anthropic.com
OpenAI      api.openai.com
Google      generativelanguage.googleapis.com
Together    api.together.xyz
Groq        api.groq.com
Mistral     api.mistral.ai
Cohere      api.cohere.ai
DeepSeek    api.deepseek.com
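A quick way to check whether a URL already falls under one of these pre-configured domains before reaching for --include (the domain list is copied from the table above; the exact-hostname matching here is an illustrative assumption, not LLI's internal logic):

```python
from urllib.parse import urlparse

PROVIDER_DOMAINS = {
    "api.anthropic.com", "api.openai.com",
    "generativelanguage.googleapis.com", "api.together.xyz",
    "api.groq.com", "api.mistral.ai", "api.cohere.ai", "api.deepseek.com",
}

def is_preconfigured(url: str) -> bool:
    """True if the URL's hostname is one of the pre-configured provider domains."""
    return urlparse(url).hostname in PROVIDER_DOMAINS
```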

Add custom providers with --include (using glob patterns):

lli watch --include "*my-custom-api.com*"

๐Ÿ› Troubleshooting

SSL Certificate Error

Problem: SSL: CERTIFICATE_VERIFY_FAILED

Solution: Install the mitmproxy CA certificate. Run lli config --cert-help for instructions.

Node.js Apps Not Working

Problem: Requests hang or timeout when using Claude Code, Cursor, etc.

Solution: Set the NODE_EXTRA_CA_CERTS environment variable:

export NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem

TLS / handshake errors behind corporate proxy

Problem: Upstream TLS handshake failures when capturing company URLs (e.g. traffic goes through a corporate proxy that uses a company CA).

Solution: Configure the upstream trust CA so LLI can verify the corporate proxy or target server certificate. Use --upstream-ca-cert, or set proxy.upstream_ca_cert in lli.toml, or LLI_UPSTREAM_CA_CERT. See the "Corporate network: upstream CA certificate" section above.

No Traffic Captured

Problem: Watch mode is running but no requests are logged

Solution:

  1. Verify proxy environment variables are set correctly
  2. Make sure the URL matches the default patterns (or add --include)
  3. Check lli config --show to see current filter patterns

📜 License

MIT License

๐Ÿค Contributing

Contributions are welcome! Please feel free to submit a Pull Request. Before committing from a fresh clone, run lli-dev-setup once to install the repository's pre-commit hook locally.

📞 Support
