# AgentLens
Profile AI agents by intercepting LLM API traffic through a local MITM proxy. Understand how agents work: prompts, tools, MCP, token usage, costs, and timing — all in a real-time web UI.
## Prerequisites
- Python >= 3.11
- uv
- Node.js >= 18
## Install

```bash
git clone https://github.com/agenticloops/agentlens.git
cd agentlens
make install
```
## Run

```bash
# Start the proxy (port 8080) and web UI (port 8081)
agentlens start

# Or equivalently
uv run agentlens start
make dev
```
This opens the web UI at http://127.0.0.1:8081 and starts the MITM proxy on port 8080.
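Before attaching an agent, you can confirm both services are listening. A minimal stdlib sketch (the `port_open` helper is hypothetical, not part of AgentLens; ports assume the defaults above):

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP listener accepts connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Default AgentLens ports; adjust if you passed --proxy-port / --web-port.
    for name, port in (("proxy", 8080), ("web UI", 8081)):
        state = "listening" if port_open("127.0.0.1", port) else "not reachable"
        print(f"{name} on 127.0.0.1:{port}: {state}")
```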
## CLI Options

```
agentlens start [OPTIONS]

Options:
  --proxy-port INT     Port for the MITM proxy          [default: 8080]
  --web-port INT       Port for the web UI              [default: 8081]
  --host TEXT          Host to bind to                  [default: 127.0.0.1]
  --session-name TEXT  Name for this profiling session  [default: auto-generated]
  --db-path TEXT       Path to SQLite database          [default: ~/.agentlens/data.db]
  --open/--no-open     Open web UI in browser           [default: --open]
```
## Certificate Setup
On first run, mitmproxy generates a CA certificate at ~/.mitmproxy/. You need to either trust this certificate or disable SSL verification for your agent to work through the proxy.
### Trust the certificate system-wide (macOS)

```bash
sudo security add-trusted-cert -d -r trustRoot \
  -k /Library/Keychains/System.keychain \
  ~/.mitmproxy/mitmproxy-ca-cert.pem
```
### Trust the certificate system-wide (Linux)

```bash
# Debian/Ubuntu
sudo cp ~/.mitmproxy/mitmproxy-ca-cert.pem /usr/local/share/ca-certificates/mitmproxy.crt
sudo update-ca-certificates

# RHEL/Fedora
sudo cp ~/.mitmproxy/mitmproxy-ca-cert.pem /etc/pki/ca-trust/source/anchors/mitmproxy.pem
sudo update-ca-trust
```
### Per-tool certificate environment variables

Instead of trusting the certificate system-wide, you can point individual tools at it:

```bash
# Python (requests / urllib3)
export REQUESTS_CA_BUNDLE=~/.mitmproxy/mitmproxy-ca-cert.pem

# Python (httpx)
export SSL_CERT_FILE=~/.mitmproxy/mitmproxy-ca-cert.pem

# Node.js
export NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem

# curl
curl --cacert ~/.mitmproxy/mitmproxy-ca-cert.pem ...
```
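On the Python side, `SSL_CERT_FILE` works because OpenSSL's default verify paths honor it. You can confirm from the stdlib that the interpreter will pick up the mitmproxy cert (the path below is the default mitmproxy location):

```python
import os
import ssl
from pathlib import Path

# Point OpenSSL's default CA file at the mitmproxy cert, as in the export above.
cert = Path.home() / ".mitmproxy" / "mitmproxy-ca-cert.pem"
os.environ["SSL_CERT_FILE"] = str(cert)

# get_default_verify_paths() resolves SSL_CERT_FILE; cafile comes back as None
# if the file does not exist yet (i.e. mitmproxy has not generated it).
paths = ssl.get_default_verify_paths()
if paths.cafile == str(cert):
    print("Python will verify TLS against the mitmproxy CA")
else:
    print("cert not found - run `agentlens start` once to generate it")
```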
### Skip verification entirely (not recommended for production)

```bash
# Node.js
export NODE_TLS_REJECT_UNAUTHORIZED=0

# Python
export PYTHONHTTPSVERIFY=0
```
## Usage
Start the profiler in one terminal, then launch your agent in another with proxy environment variables set. The profiler captures all LLM API traffic transparently — no code changes needed.
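This works without code changes because most HTTP stacks read `HTTP_PROXY`/`HTTPS_PROXY` from the environment. In Python you can see exactly what the stdlib resolves from the same variables the examples in this section export:

```python
import os
import urllib.request

# Same variables the agent launch commands set on the command line.
os.environ["HTTP_PROXY"] = "http://127.0.0.1:8080"
os.environ["HTTPS_PROXY"] = "http://127.0.0.1:8080"

# urllib (and libraries built on it) consult these via getproxies().
proxies = urllib.request.getproxies()
print(proxies["http"], proxies["https"])
```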
### Claude Code

```bash
# Terminal 1
agentlens start --session-name "claude-code-debug-session"

# Terminal 2
HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem \
claude
```

Or skip cert verification:

```bash
HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
NODE_TLS_REJECT_UNAUTHORIZED=0 \
claude
```
### Codex CLI

```bash
HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem \
codex
```
### OpenAI Python SDK

```bash
HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
REQUESTS_CA_BUNDLE=~/.mitmproxy/mitmproxy-ca-cert.pem \
python my_agent.py
```
### Anthropic Python SDK

```bash
HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
SSL_CERT_FILE=~/.mitmproxy/mitmproxy-ca-cert.pem \
python my_claude_agent.py
```
### LangChain / LlamaIndex / Any Python Agent

```bash
HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
REQUESTS_CA_BUNDLE=~/.mitmproxy/mitmproxy-ca-cert.pem \
SSL_CERT_FILE=~/.mitmproxy/mitmproxy-ca-cert.pem \
python my_langchain_agent.py
```
### Node.js Agents (Vercel AI SDK, etc.)

```bash
HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem \
node my_agent.js
```
### curl (Quick Test)

```bash
curl https://api.anthropic.com/v1/messages \
  -x http://127.0.0.1:8080 \
  --cacert ~/.mitmproxy/mitmproxy-ca-cert.pem \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{
    "model": "claude-haiku-4",
    "max_tokens": 128,
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```
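The same quick test can be sketched in stdlib Python. This only constructs the request; actually sending it needs a real `ANTHROPIC_API_KEY`, the proxy running, and the mitmproxy CA trusted, so the send is left commented:

```python
import json
import os
import urllib.request

def build_messages_request() -> urllib.request.Request:
    """Build the same POST the curl example sends."""
    body = json.dumps({
        "model": "claude-haiku-4",
        "max_tokens": 128,
        "messages": [{"role": "user", "content": "Hello!"}],
    }).encode()
    return urllib.request.Request(
        "https://api.anthropic.com/v1/messages",
        data=body,
        headers={
            "x-api-key": os.environ.get("ANTHROPIC_API_KEY", ""),
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
        method="POST",
    )

# To actually send through the proxy:
# opener = urllib.request.build_opener(
#     urllib.request.ProxyHandler({"https": "http://127.0.0.1:8080"})
# )
# print(opener.open(build_messages_request(), timeout=30).read().decode())

req = build_messages_request()
print(req.get_method(), req.full_url)
```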
## What Gets Captured

| Provider | Hosts | Paths |
|---|---|---|
| OpenAI | api.openai.com | /v1/chat/completions, /v1/responses |
| Anthropic | api.anthropic.com | /v1/messages |

All other HTTP traffic passes through the proxy transparently without being captured or stored.
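For intuition, the table amounts to a host-and-path allowlist. A hypothetical sketch of that rule (not the project's actual implementation):

```python
# Hypothetical allowlist mirroring the table above; the real proxy's
# matching logic may differ.
CAPTURE_RULES = {
    "api.openai.com": ("/v1/chat/completions", "/v1/responses"),
    "api.anthropic.com": ("/v1/messages",),
}

def should_capture(host: str, path: str) -> bool:
    """True if a request to host/path would be recorded as an LLM call."""
    return any(path.startswith(p) for p in CAPTURE_RULES.get(host, ()))

print(should_capture("api.openai.com", "/v1/chat/completions"))  # True
print(should_capture("example.com", "/v1/messages"))             # False
```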
## License

MIT