
context-pilot-mcp

Universal context manager for AI coding agents.

context-pilot-mcp tracks file relevance, compresses stale files via AST, and evicts dead weight to keep your AI agent's context window highly optimized.

No manual bookkeeping required. By using the provided read_file tool, the engine automatically handles event tracking, token counting, and structural compression in the background. It is natively compatible with Claude Code, Cursor, VS Code (Cline/Roo), and more.


The Problem

AI coding agents (like Aider or Claude Code) are great at pulling files into context, but terrible at letting them go. Over a multi-turn session, the context window fills up with files that stopped being relevant ten turns ago.

  • Local model users (4K-32K windows) crash or overflow early.
  • Cloud users pay massive, unnecessary API costs for carrying dead weight.
  • "Lost in the middle" degradation causes AI to hallucinate or ignore instructions.

The V0.2.1 Pivot: Transparent Proxy

Unlike other context managers that require the LLM to remember to "track" its own actions, context-pilot-mcp acts as a transparent middleware:

  1. Automatic Tracking: Whenever the agent reads a file through the MCP, we log the event and update the staleness matrix.
  2. Dynamic Compression: If a file is stale, read_file automatically returns a tree-sitter compressed version (signatures only) to save tokens without the LLM even knowing.
  3. Multi-Language AST: Supports Python, JavaScript, TypeScript, and Go out of the box using tree-sitter.
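The "signatures only" compression above can be illustrated with a simplified sketch. This uses Python's stdlib `ast` module instead of tree-sitter (which needs per-language grammars installed), so it only handles Python, but the idea is the same: keep the structural skeleton, drop the bodies.

```python
import ast


def compress_to_signatures(source: str) -> str:
    """Reduce a Python module to its structural skeleton:
    class and function signatures with bodies elided."""
    tree = ast.parse(source)
    lines = []
    for node in ast.walk(tree):
        if isinstance(node, ast.ClassDef):
            lines.append(f"class {node.name}: ...")
        elif isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            args = ", ".join(a.arg for a in node.args.args)
            lines.append(f"def {node.name}({args}): ...")
    return "\n".join(lines)


src = "class Foo:\n    def bar(self, x):\n        return x * 2\n"
print(compress_to_signatures(src))
# class Foo: ...
# def bar(self, x): ...
```

The agent still sees which classes and functions exist (and their parameters), but the implementation details no longer consume tokens.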

Benchmarks (Aider Codebase)

When tested on large Python files from the Aider codebase, the "compress-before-drop" AST engine yields substantial token savings without blinding the LLM to the codebase structure:

| File          | Original Tokens | Compressed Tokens | Savings |
|---------------|-----------------|-------------------|---------|
| base_coder.py | 21,575          | 7,858             | 64%     |
| commands.py   | 15,557          | 5,796             | 63%     |
| repomap.py    | 6,825           | 2,080             | 70%     |
| Total         | 43,957          | 15,734            | 64%     |
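The savings percentages follow directly from the token counts; a quick arithmetic check of the table:

```python
# (original tokens, compressed tokens) per file, from the benchmark table
rows = {
    "base_coder.py": (21_575, 7_858),
    "commands.py":   (15_557, 5_796),
    "repomap.py":    (6_825, 2_080),
}

savings = {name: f"{1 - comp / orig:.0%}" for name, (orig, comp) in rows.items()}
total_orig = sum(o for o, _ in rows.values())
total_comp = sum(c for _, c in rows.values())
savings["Total"] = f"{1 - total_comp / total_orig:.0%}"
print(savings)
# {'base_coder.py': '64%', 'commands.py': '63%', 'repomap.py': '70%', 'Total': '64%'}
```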

In a simulated 10-turn coding session with a 16K budget, context-pilot-mcp successfully reclaimed 23,479 tokens dynamically without deleting active context.
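The package does not document its exact staleness formula, but one plausible scheme consistent with the configuration options below (stale_after_turns, edit_stickiness) is a linear score over turn age, with recently edited files granted extra grace turns. This is a hypothetical sketch, not the package's actual implementation:

```python
def staleness(turns_since_ref: int, stale_after_turns: int = 5,
              edited_recently: bool = False, edit_stickiness: int = 3) -> float:
    """Hypothetical staleness score in [0, 1]: 0 = just used, 1 = fully stale.
    Edited files get `edit_stickiness` extra grace turns before aging."""
    grace = edit_stickiness if edited_recently else 0
    effective_age = max(0, turns_since_ref - grace)
    return min(1.0, effective_age / stale_after_turns)


print(staleness(0))                        # just read: 0.0
print(staleness(5))                        # untouched for 5 turns: 1.0
print(staleness(5, edited_recently=True))  # edited, so only 2 effective turns: 0.4
```

Under this scheme, a min_staleness_to_flag of 0.6 would flag an unedited file after 3 idle turns, but an edited one only after 6.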


Installation

Install globally on your system:

pip install context-pilot-mcp

Connecting to your AI Agents

Because it uses the open Model Context Protocol (MCP), integration is native and takes seconds.

1. Claude Code

Register the server globally via the Claude CLI:

Mac / Linux:

claude mcp add context-pilot-mcp context-pilot-mcp

Windows:

claude mcp add context-pilot-mcp python -m context_pilot.adapters.mcp_server

2. Cursor IDE

  1. Open Settings (Cmd/Ctrl + ,).
  2. Go to Features > MCP Servers and click + Add new MCP server.
  3. Set Name to context-pilot, Type to command, and Command to context-pilot-mcp. (On Windows, if the binary is not in PATH, use: python -m context_pilot.adapters.mcp_server)
  4. Save.

Provided MCP Tools

Once attached, your AI agent automatically gains the following capabilities:

  • read_file(path): The Primary Hook. Replaces standard file reading. Automatically tracks usage, counts tokens, and applies AST-compression if the file is stale.
  • optimize_context(budget, pressure): Imperative Enforcement. Evaluates context and provides the agent with a list of files it MUST drop or compress to stay within limits.
  • status(): Returns a tabular visualization of all loaded context, their tokens, turn ages, and staleness scores.
  • start_task(): Explicitly advances the turn counter for a new task.
  • drop_file(path): Notifies the engine that a file has been purged.
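The engine state behind these tools can be sketched as a small bookkeeping structure: every read_file call records token count and turn of last use, start_task advances the clock, and status reports per-file age. This is an illustrative mock, not the package's actual internals:

```python
from dataclasses import dataclass, field


@dataclass
class TrackedFile:
    tokens: int
    last_turn: int


@dataclass
class ContextEngine:
    turn: int = 0
    files: dict = field(default_factory=dict)

    def read_file(self, path: str, tokens: int) -> None:
        # Every read refreshes the file's last-used turn.
        self.files[path] = TrackedFile(tokens, self.turn)

    def start_task(self) -> None:
        # Explicitly advance the turn counter for a new task.
        self.turn += 1

    def drop_file(self, path: str) -> None:
        # File was purged from the agent's context; stop tracking it.
        self.files.pop(path, None)

    def status(self) -> dict:
        # path -> (tokens, turns since last reference)
        return {p: (f.tokens, self.turn - f.last_turn)
                for p, f in self.files.items()}


engine = ContextEngine()
engine.read_file("main.py", 1200)
engine.start_task()
engine.start_task()
print(engine.status())  # {'main.py': (1200, 2)}
```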

Configuration

You can tune the engine's behavior by placing a .context-pilot.toml file in the root of your project:

[context-pilot]
mode = "suggest"            # "suggest", "auto", or "off"
pressure_threshold = 0.8    # Trigger eviction when window is 80% full
stale_after_turns = 5       # Unreferenced turns before considering stale
edit_stickiness = 3         # Edited files stay in context for N extra turns
compress_before_drop = true # Try AST compression before full eviction
max_context_tokens = 128000 # Your model's context window size
min_staleness_to_flag = 0.6 # Only flag files that are at least 60% stale

License: MIT
