AgentLens
Profile AI agents by intercepting LLM API traffic through a local MITM proxy. Understand how agents work: prompts, tools, MCP, token usage, costs, and timing — all in a real-time web UI.
Supported Providers
| Provider | Hosts | Endpoints |
|---|---|---|
| Anthropic | api.anthropic.com | /v1/messages |
| OpenAI | api.openai.com | /v1/chat/completions, /v1/responses |
| Google Gemini | generativelanguage.googleapis.com, cloudcode-pa.googleapis.com | :generateContent, :streamGenerateContent |
| GitHub Copilot | api.individual.githubcopilot.com, api.business.githubcopilot.com, api.enterprise.githubcopilot.com | All of the above (auto-detected by host) |
All other HTTP traffic passes through the proxy transparently without being captured or stored.
Providers are auto-discovered plugins — adding a new one requires no changes to core code.
Quickstart
# 1. Install
pip install agentlens-proxy
# 2. Start the profiler (opens web UI automatically)
agentlens start
# 3. In another terminal, run your agent through the proxy
HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
NODE_TLS_REJECT_UNAUTHORIZED=0 \
claude
That's it — open http://127.0.0.1:8081 to see every LLM request in real time.
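To confirm both services came up before launching an agent, you can probe the default ports (8080 for the proxy, 8081 for the web UI). This is a generic TCP check, not an AgentLens command:

```python
# Sanity-check that the proxy (8080) and web UI (8081) are listening.
# Ports assume the defaults from `agentlens start`.
import socket


def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


print(port_open("127.0.0.1", 8080), port_open("127.0.0.1", 8081))
```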
For workloads that do not honor HTTP_PROXY and need host-boundary interception, use transparent capture instead:
sudo agentlens capture --mode transparent --target-host api.anthropic.com --label cowork
Prerequisites
- Python >= 3.11
- uv
- Node.js >= 18
Install
pip install agentlens-proxy
From source
git clone https://github.com/agenticloops-ai/agentlens.git
cd agentlens
make install
Run
# Start the proxy (port 8080) and web UI (port 8081)
agentlens start
# Start transparent capture for VM-like workloads (requires sudo on macOS)
sudo agentlens capture --mode transparent --target-host api.anthropic.com --label cowork
# Or, from a source checkout
uv run agentlens start
make dev
This opens the web UI at http://127.0.0.1:8081 and starts the MITM proxy on port 8080.
CLI Options
agentlens start
Start the proxy and web UI.
agentlens start [OPTIONS]
Options:
--proxy-port INT Port for the MITM proxy [default: 8080]
--web-port INT Port for the web UI [default: 8081]
--host TEXT Host to bind to [default: 127.0.0.1]
--session-name TEXT Name for this profiling session [default: auto-generated]
--db-path TEXT Path to SQLite database [default: ~/.agentlens/data.db]
--open/--no-open Open web UI in browser [default: --open]
agentlens wait
Start proxy and wait for Ctrl+C, then export results. Useful for headless/scripted capture.
agentlens wait [OPTIONS]
Options:
--output TEXT Output directory for exported files [default: results]
--formats TEXT Comma-separated export formats [default: json,markdown,csv]
--session-name TEXT Override auto-generated session name [default: auto-generated]
--proxy-port INT Port for the MITM proxy [default: 8080]
--web-port INT Port for the web UI [default: 8081]
--host TEXT Host to bind to [default: 127.0.0.1]
--db-path TEXT Path to SQLite database [default: ~/.agentlens/data.db]
--web/--no-web Start web UI alongside proxy [default: --web]
--open/--no-open Open web UI in browser [default: --no-open]
Example:
# Terminal 1 — start proxy and wait
agentlens wait --output results/claude-codegen
# Terminal 2 — run your agent with proxy env vars
HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem \
claude -p "refactor the auth module"
# Press Ctrl+C in Terminal 1 when done — results are exported to:
# results/claude-codegen/2026-02-25T14-30-00/
agentlens capture
Start a capture session using either explicit proxy mode or transparent interception.
agentlens capture [OPTIONS]
Options:
--mode TEXT Capture mode: explicit_proxy or transparent
[default: explicit_proxy]
--proxy-port INT Port for the capture listener [default: 8080]
--web-port INT Port for the web UI [default: 8081]
--host TEXT Host to bind the web UI to [default: 127.0.0.1]
--session-name TEXT Name for this profiling session [default: auto-generated]
--db-path TEXT Path to SQLite database [default: ~/.agentlens/data.db]
--open / --no-open Open web UI in browser [default: --open]
--target-host TEXT Transparent capture target host [repeatable]
--target-ip TEXT Transparent capture target IP [repeatable]
--label TEXT Optional label for this capture
--pf-user TEXT Redirect only this local user's traffic
Examples:
# Generic explicit proxy capture
agentlens capture --mode explicit_proxy
# Transparent capture for Cowork/Claude local-agent traffic on macOS
sudo agentlens capture --mode transparent --target-host api.anthropic.com --label cowork
agentlens export
Export a previously captured session from the database.
agentlens export SESSION [OPTIONS]
Arguments:
SESSION Session ID or session name
Options:
--output-dir TEXT Output directory [default: exports]
--formats TEXT Comma-separated export formats [default: json,markdown,csv]
--db-path TEXT Path to SQLite database [default: ~/.agentlens/data.db]
Example:
agentlens export "Session 2026-02-25 14:30" --output-dir exports/
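Exported JSON can be post-processed with ordinary tooling. The snippet below sums token usage per model from a small inline sample; the field names (requests, model, usage, input_tokens, output_tokens) are illustrative — check the actual export schema before relying on them:

```python
# Hypothetical post-processing of a JSON export: total tokens per model.
# The schema shown here is an assumption, not the documented export format.
import json

sample = json.loads("""
{"requests": [
  {"model": "claude-haiku-4", "usage": {"input_tokens": 120, "output_tokens": 40}},
  {"model": "claude-haiku-4", "usage": {"input_tokens": 300, "output_tokens": 90}}
]}
""")

totals: dict[str, int] = {}
for req in sample["requests"]:
    usage = req["usage"]
    totals[req["model"]] = (totals.get(req["model"], 0)
                            + usage["input_tokens"] + usage["output_tokens"])

print(totals)  # per-model input+output token totals
```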
Certificate Setup
On first run, mitmproxy generates a CA certificate at ~/.mitmproxy/. You need to either trust this certificate or disable SSL verification for your agent to work through the proxy.
Trust the certificate system-wide (macOS)
sudo security add-trusted-cert -d -r trustRoot \
-k /Library/Keychains/System.keychain \
~/.mitmproxy/mitmproxy-ca-cert.pem
Trust the certificate system-wide (Linux)
# Debian/Ubuntu
sudo cp ~/.mitmproxy/mitmproxy-ca-cert.pem /usr/local/share/ca-certificates/mitmproxy.crt
sudo update-ca-certificates
# RHEL/Fedora
sudo cp ~/.mitmproxy/mitmproxy-ca-cert.pem /etc/pki/ca-trust/source/anchors/mitmproxy.pem
sudo update-ca-trust
Per-tool certificate environment variables
Instead of trusting system-wide, you can point individual tools to the cert:
# Python (requests / urllib3)
export REQUESTS_CA_BUNDLE=~/.mitmproxy/mitmproxy-ca-cert.pem
# Python (httpx)
export SSL_CERT_FILE=~/.mitmproxy/mitmproxy-ca-cert.pem
# Node.js
export NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem
# curl
curl --cacert ~/.mitmproxy/mitmproxy-ca-cert.pem ...
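For Python workloads you can set the same variables in-process before any client is constructed, which is equivalent to the shell exports above (paths assume the default mitmproxy location):

```python
# Set proxy and CA env vars from Python, mirroring the shell exports above.
# Clients created after this point (requests, urllib3, httpx) pick them up.
import os

ca = os.path.expanduser("~/.mitmproxy/mitmproxy-ca-cert.pem")
os.environ["HTTP_PROXY"] = "http://127.0.0.1:8080"
os.environ["HTTPS_PROXY"] = "http://127.0.0.1:8080"
os.environ["REQUESTS_CA_BUNDLE"] = ca  # requests / urllib3
os.environ["SSL_CERT_FILE"] = ca       # httpx and ssl-module-based clients
```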
Skip verification entirely (not recommended for production)
# Node.js
export NODE_TLS_REJECT_UNAUTHORIZED=0
# Python
export PYTHONHTTPSVERIFY=0
Usage
Start the profiler in one terminal, then launch your agent in another with proxy environment variables set. The profiler captures all LLM API traffic transparently — no code changes needed.
For VM-like or GUI workloads that ignore proxy environment variables, use transparent capture mode on macOS. Transparent mode requires:
- sudo
- a trusted mitmproxy CA at ~/.mitmproxy/mitmproxy-ca-cert.pem
- one or more --target-host or --target-ip values so PF can scope the redirect
AgentLens uses provider plugins to decide which intercepted requests are real LLM calls, so the transport remains generic while request classification stays provider-specific.
Claude Code
# Terminal 1
agentlens start --session-name "claude-code-debug-session"
# Terminal 2
HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem \
claude
Or skip cert verification:
HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
NODE_TLS_REJECT_UNAUTHORIZED=0 \
claude
Codex CLI
HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem \
codex
OpenAI Python SDK
HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
REQUESTS_CA_BUNDLE=~/.mitmproxy/mitmproxy-ca-cert.pem \
python my_agent.py
Anthropic Python SDK
HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
SSL_CERT_FILE=~/.mitmproxy/mitmproxy-ca-cert.pem \
python my_claude_agent.py
LangChain / LlamaIndex / Any Python Agent
HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
REQUESTS_CA_BUNDLE=~/.mitmproxy/mitmproxy-ca-cert.pem \
SSL_CERT_FILE=~/.mitmproxy/mitmproxy-ca-cert.pem \
python my_langchain_agent.py
Node.js Agents (Vercel AI SDK, etc.)
HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem \
node my_agent.js
curl (Quick Test)
curl https://api.anthropic.com/v1/messages \
-x http://127.0.0.1:8080 \
--cacert ~/.mitmproxy/mitmproxy-ca-cert.pem \
-H "x-api-key: $ANTHROPIC_API_KEY" \
-H "anthropic-version: 2023-06-01" \
-H "content-type: application/json" \
-d '{
"model": "claude-haiku-4",
"max_tokens": 128,
"messages": [{"role": "user", "content": "Hello!"}]
}'
Advanced: tmux single-command workflow
Use tmux to run the proxy and your agent side-by-side in a single session. When the agent exits, the proxy is automatically stopped and results are exported.
# 1. Start proxy in a detached tmux session
tmux new-session -d -s agentlens \
'agentlens wait --output results/my-run --no-open'
# 2. Split a pane that runs the agent, then sends Ctrl+C to the proxy on exit
tmux split-window -h -t agentlens \
'sleep 2 && \
HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem \
SSL_CERT_FILE=~/.mitmproxy/mitmproxy-ca-cert.pem \
REQUESTS_CA_BUNDLE=~/.mitmproxy/mitmproxy-ca-cert.pem \
claude -p "refactor the auth module"; \
tmux send-keys -t agentlens:0.0 C-c'
# 3. Attach to watch it live
tmux attach -t agentlens
What happens:
- Left pane — agentlens wait starts the proxy and web UI, waits for Ctrl+C
- Right pane — waits 2s for the proxy to be ready, then runs your agent with all proxy/cert env vars
- When the agent finishes, tmux send-keys C-c signals the proxy to stop and export results
- Results are written to results/my-run/<timestamp>/
To also open the web UI while capturing:
tmux new-session -d -s agentlens \
'agentlens wait --output results/my-run --open'
You can swap claude -p "..." for any command — python my_agent.py, codex, node agent.js, etc.
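The same orchestration can be scripted without tmux: start agentlens wait as a subprocess, run the agent with the proxy environment, then deliver SIGINT (the Ctrl+C equivalent) so results are exported. This is a sketch under the assumptions above, not a tested wrapper:

```python
# Sketch of the tmux workflow as a Python wrapper: proxy in the background,
# agent in the foreground, SIGINT on exit to trigger export.
import os
import signal
import subprocess
import time


def run_profiled(agent_cmd: list[str], output: str = "results/my-run") -> None:
    ca = os.path.expanduser("~/.mitmproxy/mitmproxy-ca-cert.pem")
    env = {
        **os.environ,
        "HTTP_PROXY": "http://127.0.0.1:8080",
        "HTTPS_PROXY": "http://127.0.0.1:8080",
        "NODE_EXTRA_CA_CERTS": ca,
        "SSL_CERT_FILE": ca,
        "REQUESTS_CA_BUNDLE": ca,
    }
    proxy = subprocess.Popen(["agentlens", "wait", "--output", output, "--no-open"])
    time.sleep(2)  # give the proxy time to bind its ports
    try:
        subprocess.run(agent_cmd, env=env, check=True)
    finally:
        proxy.send_signal(signal.SIGINT)  # Ctrl+C equivalent → export results
        proxy.wait()
```

Usage mirrors the tmux example: run_profiled(["claude", "-p", "refactor the auth module"]), or any other agent command.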
Wrapper script for awkward workloads
Use the checked-in wrapper instead of maintaining an ad hoc shell function:
./.tools/lens-run.sh -- claude -p "refactor the auth module"
For Cowork/Claude local-agent mode, use the built-in transparent-capture preset:
./.tools/lens-run.sh --cowork -- /Applications/Claude.app/Contents/MacOS/Claude
This wrapper:
- starts AgentLens in tmux
- launches your workload in a second pane
- uses explicit proxy mode for normal CLI/SDK agents
- uses agentlens capture --mode transparent for --cowork
You can still use it for ordinary commands:
./.tools/lens-run.sh -o results/my-test -s my-test -- python my_agent.py
License
MIT