
Charles Proxy MCP server with live capture, structured traffic analysis, and agent-friendly tool contracts


Charles MCP Server


Chinese README | Tool Contract

Charles MCP Server connects Charles Proxy to MCP clients so an agent can inspect live traffic, analyze saved recordings, and expand individual requests only when needed.

It focuses on three things:

  • reading incremental traffic from the current Charles session while recording is still active
  • keeping live and history analysis on structured paths instead of exposing raw dump dictionaries first
  • using summary-first outputs so the agent can find hotspots before pulling detail

Quick Start

1. Enable the Charles Web Interface

In Charles, open: Proxy -> Web Interface Settings

Make sure:

  • Enable web interface is checked
  • username is admin
  • password is 123456

Menu location:

Charles Web Interface Menu

Settings dialog:

Charles Web Interface Settings

2. Install and configure your MCP client

No cloning, no manual virtualenv. Requires uv.

Claude Code CLI

claude mcp add-json charles '{
  "type": "stdio",
  "command": "uvx",
  "args": ["charles-mcp"],
  "env": {
    "CHARLES_USER": "admin",
    "CHARLES_PASS": "123456",
    "CHARLES_MANAGE_LIFECYCLE": "false"
  }
}'

Claude Desktop / Cursor / generic JSON config

{
  "mcpServers": {
    "charles": {
      "command": "uvx",
      "args": ["charles-mcp"],
      "env": {
        "CHARLES_USER": "admin",
        "CHARLES_PASS": "123456",
        "CHARLES_MANAGE_LIFECYCLE": "false"
      }
    }
  }
}

Codex CLI

[mcp_servers.charles]
command = "uvx"
args = ["charles-mcp"]

[mcp_servers.charles.env]
CHARLES_USER = "admin"
CHARLES_PASS = "123456"
CHARLES_MANAGE_LIFECYCLE = "false"

Auto-install via AI agent

Copy-paste the following prompt into any AI agent (Claude Code, ChatGPT, Gemini CLI, Cursor Agent, etc.) and it will install and configure charles-mcp automatically:

Install the "charles-mcp" MCP server and configure it for my MCP client. Follow these steps exactly:

Step 1 — Detect OS:
  Determine if this machine runs Windows, macOS, or Linux.

Step 2 — Ensure uv is installed:
  Run: uv --version
  If the command fails (uv not found):
    - macOS/Linux: run: curl -LsSf https://astral.sh/uv/install.sh | sh
    - Windows: run: powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
  After installing, verify uv works: uv --version

Step 3 — Detect which MCP client is installed:
  Check in this order and use the FIRST match:

  a) Claude Code — run: claude --version
     If it succeeds, run this command and skip to Step 5:
       claude mcp add-json charles '{"type":"stdio","command":"uvx","args":["charles-mcp"],"env":{"CHARLES_USER":"admin","CHARLES_PASS":"123456","CHARLES_MANAGE_LIFECYCLE":"false"}}'

  b) Claude Desktop — check if config file exists:
     - macOS:   ~/Library/Application Support/Claude/claude_desktop_config.json
     - Windows: %APPDATA%\Claude\claude_desktop_config.json
     - Linux:   ~/.config/Claude/claude_desktop_config.json

  c) Cursor — check if any of these exist:
     - ~/.cursor/mcp.json
     - .cursor/mcp.json (in current project)

  d) Windsurf — check if exists:
     - ~/.codeium/windsurf/mcp_config.json

  If none detected, ask the user which client they use.

Step 4 — Write config (for clients b/c/d):
  The config entry to add is:
    "charles": {
      "command": "uvx",
      "args": ["charles-mcp"],
      "env": {
        "CHARLES_USER": "admin",
        "CHARLES_PASS": "123456",
        "CHARLES_MANAGE_LIFECYCLE": "false"
      }
    }

  IMPORTANT: If the config file already exists, read it first, parse the JSON,
  add the "charles" key inside "mcpServers" (create "mcpServers" if absent),
  and write back. Do NOT overwrite other existing MCP server entries.
  If the file does not exist, create it with:
    { "mcpServers": { "charles": { ... } } }

Step 5 — Verify:
  Run: uvx charles-mcp
  Wait 3 seconds then terminate the process.
  If it starts without import errors, the installation is successful.

Step 6 — Report:
  Print: "charles-mcp installed successfully. Please restart your MCP client to load the new server."
  Also mention that Charles Proxy must be running with Web Interface enabled (Proxy → Web Interface Settings, username: admin, password: 123456).
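The non-destructive merge in Step 4 can be sketched in Python; `merge_charles_entry` is a hypothetical helper for illustration, not part of charles-mcp:

```python
import json
from pathlib import Path

# The entry to add, matching the config shown in Step 4.
CHARLES_ENTRY = {
    "command": "uvx",
    "args": ["charles-mcp"],
    "env": {
        "CHARLES_USER": "admin",
        "CHARLES_PASS": "123456",
        "CHARLES_MANAGE_LIFECYCLE": "false",
    },
}

def merge_charles_entry(config_path: Path) -> dict:
    """Add "charles" under "mcpServers" without touching other servers."""
    if config_path.exists():
        config = json.loads(config_path.read_text())
    else:
        config = {}
    # Create "mcpServers" if absent; only the "charles" key is written.
    config.setdefault("mcpServers", {})["charles"] = CHARLES_ENTRY
    config_path.write_text(json.dumps(config, indent=2))
    return config
```

The important property is that existing entries survive the merge, so a second MCP server in the same config is never overwritten.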

Requirements

  • Python 3.10+
  • Charles Proxy running locally
  • Charles Web Interface enabled
  • Charles proxy listening on 127.0.0.1:8888

CHARLES_MANAGE_LIFECYCLE=false is the recommended default: unless you explicitly want the MCP server to manage the Charles lifecycle, leave it disabled so the server never shuts down your own Charles process.

Environment Variables

Variable Default Purpose
CHARLES_USER admin Charles Web Interface username
CHARLES_PASS 123456 Charles Web Interface password
CHARLES_PROXY_HOST 127.0.0.1 Charles proxy host
CHARLES_PROXY_PORT 8888 Charles proxy port
CHARLES_CONFIG_PATH auto-detect Charles config file path
CHARLES_REQUEST_TIMEOUT 10 Control-plane HTTP timeout in seconds
CHARLES_MAX_STOPTIME 3600 Maximum bounded recording length in seconds
CHARLES_MANAGE_LIFECYCLE false Whether the MCP server should manage Charles startup and shutdown
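A hypothetical sketch of how a process might read these variables with the documented defaults; the helper name and return shape are illustrative, not the server's actual code:

```python
import os

def read_charles_env() -> dict:
    """Read the documented variables, falling back to their defaults."""
    return {
        "user": os.environ.get("CHARLES_USER", "admin"),
        "password": os.environ.get("CHARLES_PASS", "123456"),
        "proxy_host": os.environ.get("CHARLES_PROXY_HOST", "127.0.0.1"),
        "proxy_port": int(os.environ.get("CHARLES_PROXY_PORT", "8888")),
        "timeout": float(os.environ.get("CHARLES_REQUEST_TIMEOUT", "10")),
        "max_stoptime": int(os.environ.get("CHARLES_MAX_STOPTIME", "3600")),
        # Lifecycle management stays off unless explicitly set to "true".
        "manage_lifecycle": os.environ.get(
            "CHARLES_MANAGE_LIFECYCLE", "false"
        ).lower() == "true",
    }
```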

Recommended Flows

Live analysis

  1. start_live_capture
  2. group_capture_analysis
  3. query_live_capture_entries
  4. get_traffic_entry_detail
  5. stop_live_capture

This path is optimized for finding hotspots first, then drilling down into one confirmed request.
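The five steps above can be sketched with a hypothetical `call_tool(name, **params)` helper standing in for an MCP client call; the filter parameter `host_contains` and the `entries` result key are assumptions for illustration (only the tool names, `capture_id`, and `entry_id` come from this README):

```python
def live_hotspot_drilldown(call_tool):
    """Summary-first live flow: group, filter read-only, expand one entry."""
    capture = call_tool("start_live_capture")
    capture_id = capture["capture_id"]
    # Step 2: lowest-token hotspot view.
    groups = call_tool("group_capture_analysis", capture_id=capture_id)
    # Step 3: read-only filter pass; does not advance the live cursor.
    matches = call_tool("query_live_capture_entries",
                        capture_id=capture_id, host_contains="api.")
    # Step 4: expand only one confirmed target, never the whole batch.
    detail = None
    if matches.get("entries"):
        detail = call_tool("get_traffic_entry_detail",
                           capture_id=capture_id,
                           entry_id=matches["entries"][0]["entry_id"])
    # Step 5: close the session.
    call_tool("stop_live_capture", capture_id=capture_id)
    return groups, detail
```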

History analysis

  1. list_recordings
  2. analyze_recorded_traffic
  3. group_capture_analysis(source="history")
  4. get_traffic_entry_detail

This path is optimized for browsing saved recordings and then drilling into selected entries.

Current Version Highlights

  • read_live_capture and peek_live_capture now return route-level summary fields only, such as host, method, path, and status, instead of raw Charles entries. This keeps repeated polling from blowing up the context window.
  • query_live_capture_entries is now a read-only analysis path and does not advance the live cursor. You can reuse the same capture_id with different filters without consuming the historical increment.
  • analyze_recorded_traffic and query_live_capture_entries summaries now expose matched_fields and match_reasons, so an agent can explain why a request was selected.
  • get_traffic_entry_detail now defaults to include_full_body=false and max_body_chars=2048. When the estimated detail payload exceeds about 12,000 characters, the tool adds a warning suggesting a narrower request.
  • Summary and detail output automatically strip null values and hide internal fields such as header_map, parsed_json, parsed_form, and lower_name. Use the headers list when you need header values.
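The size guard on get_traffic_entry_detail can be sketched as follows; the constants mirror the documented defaults, while the function body and field names are illustrative:

```python
DEFAULT_MAX_BODY_CHARS = 2048   # documented default for max_body_chars
DETAIL_WARN_THRESHOLD = 12_000  # approximate warning threshold from the README

def build_detail(body: str, include_full_body: bool = False,
                 max_body_chars: int = DEFAULT_MAX_BODY_CHARS) -> dict:
    """Clip the body unless full output is requested; warn on huge payloads."""
    text = body if include_full_body else body[:max_body_chars]
    detail = {"body": text, "truncated": len(text) < len(body)}
    if len(text) > DETAIL_WARN_THRESHOLD:
        detail["warning"] = ("detail payload is large; consider a narrower "
                             "request (lower max_body_chars or add filters)")
    return detail
```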

Tool Catalog

This README documents the recommended tool surface only. Compatibility-only aliases are intentionally not explained here.

Live capture tools

Tool What it does Typical use
start_live_capture Starts or adopts the current live capture and returns capture_id Before realtime inspection begins
read_live_capture Reads incremental live entries by cursor and returns compact route summaries only When consuming new traffic continuously and you only need host/path/status first
peek_live_capture Previews new live entries without advancing the cursor and returns compact route summaries only When you want to inspect new traffic without moving the reader state
stop_live_capture Stops the capture and optionally persists a snapshot When closing or exporting a live session
query_live_capture_entries Produces structured summary output for a live capture without advancing the cursor When repeatedly filtering high-value requests out of current traffic

Analysis tools

Tool What it does Typical use
group_capture_analysis Aggregates live or history traffic by group key When you want the lowest-token hotspot view
get_capture_analysis_stats Returns coarse traffic class counts When you want a quick distribution view
get_traffic_entry_detail Loads detail for one specific entry and warns when the payload is too large After you already identified a target entry_id
analyze_recorded_traffic Produces structured summary output for a saved recording with match reasons When analyzing a .chlsj snapshot

History tools

Tool What it does Typical use
list_recordings Lists saved recording files Before choosing a historical snapshot
get_recording_snapshot Loads the raw content of one saved recording When you need the stored snapshot itself
query_recorded_traffic Applies lightweight filtering to the latest saved recording When you need a quick host, method, or regex query

Status and control tools

Tool What it does Typical use
charles_status Reports Charles connectivity and active capture state When checking whether Charles is reachable or capture is still active
throttling Applies a Charles network throttling preset When simulating 3G, 4G, 5G, or disabling throttling
reset_environment Restores Charles configuration and clears the current environment When you need to return to a clean baseline

Key Behavior

1. Raw values are returned by default

This version no longer redacts request or response content:

  • summary, detail, live, and history outputs all return raw values
  • include_sensitive is retained only for compatibility and no longer changes results

2. Summary comes before detail

Use group_capture_analysis, query_live_capture_entries, or analyze_recorded_traffic first, then call get_traffic_entry_detail only for a confirmed target.

Do not default to include_full_body=true unless there is a clear reason.

3. Output is optimized for token budgets

All summary and detail outputs have been serialized lean:

  • Internal fields like header_map, parsed_json, parsed_form, and lower_name are excluded from tool output
  • null values are stripped automatically during serialization
  • When full_text is present in a detail view, the redundant preview_text is removed

Default parameters have been lowered to protect the context window:

Parameter Old default New default
max_items 20 10
max_preview_chars 256 128
max_headers_per_side 8 6
max_body_chars 4096 2048

Higher values can still be passed explicitly when a wider view is needed.
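A minimal sketch of this lean serialization pass, assuming a recursive walk over the output; the function is illustrative, not the server's actual serializer:

```python
# Internal fields named in the README that never reach tool output.
INTERNAL_FIELDS = {"header_map", "parsed_json", "parsed_form", "lower_name"}

def lean(obj):
    """Drop None values, internal fields, and redundant preview_text."""
    if isinstance(obj, dict):
        out = {k: lean(v) for k, v in obj.items()
               if v is not None and k not in INTERNAL_FIELDS}
        if "full_text" in out:
            out.pop("preview_text", None)  # full_text supersedes the preview
        return out
    if isinstance(obj, list):
        return [lean(v) for v in obj]
    return obj
```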

4. History detail needs stable source identity

History summaries return recording_path. Live summaries return capture_id.

For get_traffic_entry_detail:

  • prefer recording_path for history
  • prefer capture_id for live

5. stop_live_capture failures are recoverable

stop_live_capture has two stable end states:

  • status="stopped" means the capture is actually closed
  • status="stop_failed" means a short retry also failed but the capture is still preserved

When the result is:

{
  "status": "stop_failed",
  "recoverable": true,
  "active_capture_preserved": true
}

the capture is still readable and can be diagnosed or stopped again later.
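Client-side handling of these two end states might look like the following, with a hypothetical `call_tool` helper standing in for the MCP call; only the status fields shown above are assumed:

```python
import time

def stop_with_retry(call_tool, capture_id, retries=2, delay=1.0):
    """Retry while the failure is recoverable; otherwise return the result."""
    result = {}
    for _attempt in range(retries + 1):
        result = call_tool("stop_live_capture", capture_id=capture_id)
        if result.get("status") == "stopped":
            return result
        if not result.get("recoverable"):
            break  # non-recoverable failure; stop retrying
        time.sleep(delay)
    return result  # still stop_failed; the capture is preserved for diagnosis
```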

Development

Run tests:

python -m pytest -q

Useful local checks:

python charles-mcp-server.py
python -c "from charles_mcp.main import main; main()"

Acknowledgments

This project was inspired by tianhetonghua/Charles-mcp-server, and that earlier work deserves explicit credit. It proved that connecting Charles Proxy to MCP for AI-driven traffic analysis was a valid and useful direction.

At the same time, that project was not enough for the problem I wanted to solve, which is why charles-mcp is a complete rewrite from scratch rather than a small fork or patch series. The earlier project is oriented more toward reverse engineering and security workflows, with capabilities centered on harvesting, keyword interlocks, encryption detection, and task-scoped cache management. This repository targets a different job: making Charles usable as a stable, low-token, repeatable MCP server for general-purpose AI agents in clients such as Claude Code, Codex, and Cursor.

The rewrite was driven by concrete gaps I needed to solve:

  • a unified model for live capture and history analysis, instead of forcing agents to switch between separate harvesting and filtering mental models
  • summary-first, detail-on-demand outputs so agents do not immediately consume large raw dumps and blow up the context window
  • stable capture_id, cursor, and recording_path semantics so repeated queries do not accidentally consume or lose live traffic state
  • stricter tool contracts, recovery behavior, and protocol consistency for the reliability expectations of the AI agent ecosystem

So this repository is not a cosmetic variation on the earlier one. It is a full rebuild for a different operating model: more structured, more predictable, and better suited to agents that need to reason over live and historical Charles traffic without fighting the tool surface.

Support

If this project helps your work, you can support future maintenance and iteration.

WeChat donation QR


USDT-TRC20

TCudxn9ByCxPZHXLtvqBjFmLWXywBoicRs

Changelog

2026-04-13 (v2.0.2)

  • Added GitHub Actions release automation for PyPI publishing via Trusted Publisher (OIDC).
  • Added release gating with version/tag verification and twine check --strict.
  • Improved release visibility and metadata so GitHub Release and PyPI publishing stay aligned.

2026-03-27 (v2.0.1)

  • Restricted history snapshot access to managed .chlsj files so the server no longer exposes arbitrary local JSON reads through recording-path inputs.
  • Fixed live analysis so scan_limit is actually honored instead of silently stopping at a small fixed scan window.
  • Fixed request_body_contains and response_body_contains so matching is no longer limited to clipped preview text.
  • Moved installed-runtime snapshots and backups to a user state directory instead of writing runtime data into the package install tree.
  • Published 2.0.1 with the fixes above and synced the release across GitHub and PyPI.
