
🧠 context-pilot-mcp


Universal context manager for AI coding agents.

context-pilot-mcp tracks file relevance, compresses stale files via AST analysis, and evicts dead weight to keep your AI agent's context window lean.

No manual bookkeeping required. When the agent reads files through the provided read_file tool, the engine automatically handles event tracking, token counting, and structural compression in the background. It is natively compatible with Claude Code, Cursor, VS Code (Cline/Roo), and more.


🛑 The Problem

AI coding agents (like Aider or Claude Code) are great at pulling files into context, but terrible at letting them go. Over a multi-turn session, the context window fills up with files that were only relevant 10 turns ago.

  • Local model users (4K-32K windows) crash or overflow early.
  • Cloud users pay massive, unnecessary API costs for carrying dead weight.
  • "Lost in the middle" degradation causes AI to hallucinate or ignore instructions.

🚀 The V0.2.0 Pivot: Transparent Proxy

Unlike other context managers that require the LLM to remember to "track" its own actions, context-pilot-mcp acts as a transparent middleware:

  1. Automatic Tracking: Whenever the agent reads a file through the MCP, we log the event and update the staleness matrix.
  2. Dynamic Compression: If a file is stale, read_file automatically returns a tree-sitter compressed version (signatures only) to save tokens without the LLM even knowing.
  3. Multi-Language AST: Supports Python, JavaScript, TypeScript, and Go out of the box using tree-sitter.
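To make the staleness idea concrete, here is a minimal, illustrative sketch (not the actual engine's internals) of how a per-file staleness score could be derived from read events and a turn counter. The class and method names are hypothetical; only the `stale_after_turns` default of 5 comes from the configuration shown below.

```python
# Illustrative staleness bookkeeping: each read_file call records the
# current turn; staleness grows as turns pass without another read.

STALE_AFTER_TURNS = 5  # mirrors the default `stale_after_turns` setting


class StalenessTracker:
    def __init__(self):
        self.last_read = {}  # path -> turn the file was last read
        self.turn = 0

    def start_task(self):
        """Advance the turn counter (what the start_task tool does)."""
        self.turn += 1

    def record_read(self, path):
        self.last_read[path] = self.turn

    def staleness(self, path):
        """0.0 = just read, 1.0 = fully stale."""
        age = self.turn - self.last_read[path]
        return min(1.0, age / STALE_AFTER_TURNS)


tracker = StalenessTracker()
tracker.record_read("src/app.py")
for _ in range(3):
    tracker.start_task()
print(tracker.staleness("src/app.py"))  # 0.6 after 3 unreferenced turns
```

Under this model, a file crosses the default `min_staleness_to_flag` of 0.6 after three turns without a read, and becomes fully stale after five.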

📊 Benchmarks (Aider Codebase)

When tested on large Python files from the Aider codebase, the "compress-before-drop" AST engine yields substantial token savings without blinding the LLM to the codebase structure:

File           Original Tokens   Compressed Tokens   Savings
base_coder.py           21,575               7,858       64%
commands.py             15,557               5,796       63%
repomap.py               6,825               2,080       70%
Total                   43,957              15,734       64%

In a simulated 10-turn coding session with a 16K budget, context-pilot-mcp successfully reclaimed 23,479 tokens dynamically without deleting active context.
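The savings column follows directly from the token counts; a quick check of the arithmetic:

```python
# Reproducing the Savings column from the table above.
files = {
    "base_coder.py": (21575, 7858),
    "commands.py": (15557, 5796),
    "repomap.py": (6825, 2080),
}
for name, (orig, comp) in files.items():
    print(f"{name}: {100 * (1 - comp / orig):.0f}% saved")

total_orig = sum(o for o, _ in files.values())
total_comp = sum(c for _, c in files.values())
print(f"Total: {100 * (1 - total_comp / total_orig):.0f}% saved")  # Total: 64% saved
```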


📦 Installation

Install globally on your system:

pip install context-pilot-mcp

🔌 Connecting to your AI Agents

Because it utilizes the open Model Context Protocol (MCP), integration is native and takes seconds.

1. Claude Code

Register the server globally via the Claude CLI:

claude mcp add context-pilot-mcp context-pilot-mcp

2. Cursor IDE

  1. Open Settings (Cmd/Ctrl + ,).
  2. Go to Features > MCP Servers and click + Add new MCP server.
  3. Set Name to context-pilot, Type to command, and Command to context-pilot-mcp.
  4. Save.
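Alternatively, recent Cursor versions can read MCP servers from a .cursor/mcp.json file in your project (or ~/.cursor/mcp.json globally). Assuming the standard mcpServers schema, a minimal entry would look like:

```json
{
  "mcpServers": {
    "context-pilot": {
      "command": "context-pilot-mcp"
    }
  }
}
```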

🛠️ Provided MCP Tools

Once attached, your AI agent automatically gains the following capabilities:

  • read_file(path): The Primary Hook. Replaces standard file reading. Automatically tracks usage, counts tokens, and applies AST-compression if the file is stale.
  • optimize_context(budget, pressure): Imperative Enforcement. Evaluates context and provides the agent with a list of files it MUST drop or compress to stay within limits.
  • status(): Returns a tabular visualization of every loaded file, its token count, turn age, and staleness score.
  • start_task(): Explicitly advances the turn counter for a new task.
  • drop_file(path): Notifies the engine that a file has been purged.
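The behavior of optimize_context(budget, pressure) can be pictured as a pressure-triggered eviction plan. The sketch below is a hypothetical policy, not the package's actual algorithm: when loaded tokens exceed pressure × budget, it flags the stalest files (at or above the default min_staleness_to_flag of 0.6) until the context fits again.

```python
# Hypothetical eviction planner illustrating optimize_context semantics.
def plan_evictions(files, budget, pressure=0.8, min_staleness=0.6):
    """files: list of (path, tokens, staleness); returns paths to drop/compress."""
    total = sum(tokens for _, tokens, _ in files)
    if total <= pressure * budget:
        return []  # under the pressure threshold: nothing to do
    plan = []
    # Stalest first, so active context is touched last.
    for path, tokens, staleness in sorted(files, key=lambda f: -f[2]):
        if staleness < min_staleness:
            break  # never flag files the agent is still using
        plan.append(path)
        total -= tokens
        if total <= pressure * budget:
            break
    return plan


files = [("a.py", 9000, 1.0), ("b.py", 5000, 0.2), ("c.py", 4000, 0.8)]
print(plan_evictions(files, budget=16000))  # ['a.py'] -- dropping it frees enough
```

Note that "a.py" alone brings the total (18,000 tokens) back under the 12,800-token pressure line, so the fresher files survive untouched.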

⚙️ Configuration

You can tune the engine's behavior by placing a .context-pilot.toml file in the root of your project:

[context-pilot]
mode = "suggest"            # "suggest", "auto", or "off"
pressure_threshold = 0.8    # Trigger eviction when window is 80% full
stale_after_turns = 5       # Unreferenced turns before considering stale
edit_stickiness = 3         # Edited files stay in context for N extra turns
compress_before_drop = true # Try AST compression before full eviction
max_context_tokens = 128000 # Your model's context window size
min_staleness_to_flag = 0.6 # Only flag files that are at least 60% stale

License: MIT
