
AgentLens

Profile AI agents by intercepting LLM API traffic through a local MITM proxy. Understand how agents work: prompts, tools, MCP, token usage, costs, and timing — all in a real-time web UI.

Supported Providers

  • Anthropic (api.anthropic.com): /v1/messages
  • OpenAI (api.openai.com): /v1/chat/completions, /v1/responses
  • Google Gemini (generativelanguage.googleapis.com, cloudcode-pa.googleapis.com): :generateContent, :streamGenerateContent
  • GitHub Copilot (api.individual.githubcopilot.com, api.business.githubcopilot.com, api.enterprise.githubcopilot.com): all of the above endpoints, auto-detected by host

All other HTTP traffic passes through the proxy transparently without being captured or stored.

Providers are auto-discovered plugins — adding a new one requires no changes to core code.
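As a rough illustration of the idea, a provider plugin boils down to declaring which hosts and endpoints it captures, and the core matching flows against that declaration. The class and attribute names below are purely illustrative, not AgentLens's actual plugin API:

```python
# Hypothetical sketch of a provider plugin; names are illustrative,
# not AgentLens's real interface.
from dataclasses import dataclass


@dataclass(frozen=True)
class Provider:
    name: str
    hosts: frozenset[str]
    endpoints: tuple[str, ...]

    def matches(self, host: str, path: str) -> bool:
        # A flow is captured only when both host and endpoint match;
        # everything else passes through the proxy untouched.
        return host in self.hosts and any(path.startswith(e) for e in self.endpoints)


anthropic = Provider("anthropic", frozenset({"api.anthropic.com"}), ("/v1/messages",))
print(anthropic.matches("api.anthropic.com", "/v1/messages"))  # True: captured
print(anthropic.matches("example.com", "/index.html"))         # False: passes through
```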

Quickstart

# 1. Install
pip install agentlens-proxy

# 2. Start the profiler (opens web UI automatically)
agentlens start

# 3. In another terminal, run your agent through the proxy
HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
NODE_TLS_REJECT_UNAUTHORIZED=0 \
claude

That's it — open http://127.0.0.1:8081 to see every LLM request in real time.

Prerequisites

  • Python >= 3.11
  • uv
  • Node.js >= 18

Install

pip install agentlens-proxy

From source

git clone https://github.com/agenticloops-ai/agentlens.git
cd agentlens
make install

Run

# Start the proxy (port 8080) and web UI (port 8081)
agentlens start

# Or, from a source checkout
uv run agentlens start
make dev

This opens the web UI at http://127.0.0.1:8081 and starts the MITM proxy on port 8080.

CLI Options

agentlens start [OPTIONS]

Options:
  --proxy-port   INT   Port for the MITM proxy           [default: 8080]
  --web-port     INT   Port for the web UI               [default: 8081]
  --host         TEXT  Host to bind to                   [default: 127.0.0.1]
  --session-name TEXT  Name for this profiling session   [default: auto-generated]
  --db-path      TEXT  Path to SQLite database           [default: ~/.agentlens/data.db]
  --open/--no-open     Open web UI in browser            [default: --open]

Certificate Setup

On first run, mitmproxy generates a CA certificate at ~/.mitmproxy/. You need to either trust this certificate or disable SSL verification for your agent to work through the proxy.

Trust the certificate system-wide (macOS)

sudo security add-trusted-cert -d -r trustRoot \
  -k /Library/Keychains/System.keychain \
  ~/.mitmproxy/mitmproxy-ca-cert.pem

Trust the certificate system-wide (Linux)

# Debian/Ubuntu
sudo cp ~/.mitmproxy/mitmproxy-ca-cert.pem /usr/local/share/ca-certificates/mitmproxy.crt
sudo update-ca-certificates

# RHEL/Fedora
sudo cp ~/.mitmproxy/mitmproxy-ca-cert.pem /etc/pki/ca-trust/source/anchors/mitmproxy.pem
sudo update-ca-trust

Per-tool certificate environment variables

Instead of trusting system-wide, you can point individual tools to the cert:

# Python (requests / urllib3)
export REQUESTS_CA_BUNDLE=~/.mitmproxy/mitmproxy-ca-cert.pem

# Python (httpx)
export SSL_CERT_FILE=~/.mitmproxy/mitmproxy-ca-cert.pem

# Node.js
export NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem

# curl
curl --cacert ~/.mitmproxy/mitmproxy-ca-cert.pem ...
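These variables work because common TLS stacks read them from the environment. For example, Python's stdlib ssl module honors SSL_CERT_FILE; you can sanity-check this without the real mitmproxy cert, using any file that exists as a stand-in:

```python
import os
import ssl
import tempfile

# Stand-in for ~/.mitmproxy/mitmproxy-ca-cert.pem; ssl only checks
# that the path exists when resolving the default verify paths.
with tempfile.NamedTemporaryFile(suffix=".pem", delete=False) as f:
    ca_path = f.name

os.environ["SSL_CERT_FILE"] = ca_path
paths = ssl.get_default_verify_paths()
print(paths.cafile == ca_path)  # True: the stdlib will load this CA file

os.unlink(ca_path)
```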

Skip verification entirely (insecure; local debugging only)

# Node.js
export NODE_TLS_REJECT_UNAUTHORIZED=0

# Python (affects the stdlib ssl module only; requests/httpx have their own settings)
export PYTHONHTTPSVERIFY=0

Usage

Start the profiler in one terminal, then launch your agent in another with proxy environment variables set. The profiler captures all LLM API traffic transparently — no code changes needed.
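The reason no code changes are needed: standard HTTP clients resolve HTTP_PROXY/HTTPS_PROXY from the environment at request time. In Python, for instance, the stdlib (and libraries built on it, such as requests) consult urllib's proxy resolution:

```python
import os
import urllib.request

# Simulate the environment the agent is launched with
os.environ["HTTP_PROXY"] = "http://127.0.0.1:8080"
os.environ["HTTPS_PROXY"] = "http://127.0.0.1:8080"

# getproxies() scans the environment on each call, so any client
# using it is routed through the proxy with no code changes.
proxies = urllib.request.getproxies()
print(proxies["https"])  # http://127.0.0.1:8080
```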

Claude Code

# Terminal 1
agentlens start --session-name "claude-code-debug-session"

# Terminal 2
HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem \
claude

Or skip cert verification:

HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
NODE_TLS_REJECT_UNAUTHORIZED=0 \
claude

Codex CLI

HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem \
codex

OpenAI Python SDK

HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
REQUESTS_CA_BUNDLE=~/.mitmproxy/mitmproxy-ca-cert.pem \
python my_agent.py

Anthropic Python SDK

HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
SSL_CERT_FILE=~/.mitmproxy/mitmproxy-ca-cert.pem \
python my_claude_agent.py

LangChain / LlamaIndex / Any Python Agent

HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
REQUESTS_CA_BUNDLE=~/.mitmproxy/mitmproxy-ca-cert.pem \
SSL_CERT_FILE=~/.mitmproxy/mitmproxy-ca-cert.pem \
python my_langchain_agent.py

Node.js Agents (Vercel AI SDK, etc.)

HTTP_PROXY=http://127.0.0.1:8080 \
HTTPS_PROXY=http://127.0.0.1:8080 \
NODE_EXTRA_CA_CERTS=~/.mitmproxy/mitmproxy-ca-cert.pem \
node my_agent.js

curl (Quick Test)

curl https://api.anthropic.com/v1/messages \
  -x http://127.0.0.1:8080 \
  --cacert ~/.mitmproxy/mitmproxy-ca-cert.pem \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{
    "model": "claude-haiku-4",
    "max_tokens": 128,
    "messages": [{"role": "user", "content": "Hello!"}]
  }'

License

MIT
