
Local context compiler for AI coding assistants — smallest correct context bundle with rationale


context-compiler

An MCP server that indexes your codebase into a dependency graph and returns the smallest correct context for any coding task: exact line ranges per symbol, with a rationale for each one.

Runs entirely on your machine. No cloud, no LLM API calls, no embeddings server, no internet connection required. Your source code and task descriptions never leave your laptop. The base install is lightweight: pure Python, no GPU, no heavy ML framework.


Getting started

pip install context-compiler-mcp
cd /your/project
context-compiler init

By default init registers with Claude Code. Pass --client to target a different assistant:

context-compiler init --client claude    # Claude Code (default)
context-compiler init --client gemini    # Gemini CLI
context-compiler init --client codex     # OpenAI Codex CLI

init indexes your codebase and registers the MCP server with your chosen assistant. For Claude Code it also adds instructions to CLAUDE.md so the assistant calls get_context automatically before reading files.

Requires Python 3.11+.
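For orientation, registration with Claude Code amounts to a project-scoped MCP server entry. The snippet below is illustrative only (the server name and `serve` argument are assumptions, not taken from this project); Claude Code reads project-scoped servers from a `.mcp.json` file of this general shape:

```json
{
  "mcpServers": {
    "context-compiler": {
      "command": "context-compiler",
      "args": ["serve"]
    }
  }
}
```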

Multi-repo projects

context-compiler init --dependencies ../repo1,../repo2
context-compiler init --client codex --dependencies ../repo1,../repo2

Each repo is indexed separately. get_context queries all graphs and returns the best-matching symbols across all repos. The dependency list is saved and picked up automatically on next start.
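Conceptually, the multi-repo merge is a rank-and-take over per-repo hits. A minimal sketch, with invented names and scores (not the project's actual code):

```python
# Illustrative sketch: merge ranked symbol hits from several per-repo
# indexes into one globally ranked result list.
from dataclasses import dataclass


@dataclass
class Hit:
    repo: str
    symbol: str
    score: float


def merge_results(per_repo: dict[str, list[Hit]], limit: int = 10) -> list[Hit]:
    """Flatten hits from every repo and keep the globally best-scoring symbols."""
    merged = [hit for hits in per_repo.values() for hit in hits]
    merged.sort(key=lambda h: h.score, reverse=True)
    return merged[:limit]
```

Because each repo keeps its own graph, re-indexing one dependency never invalidates the others.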

Other commands

context-compiler index                        # re-index after large changes
context-compiler explain --task "<prompt>"    # preview what context a task returns

Optional: semantic search

pip install "context-compiler-mcp[semantic]"

Enables an embedding-based fallback for cases where task terms don't appear in symbol names (e.g. "fix login flow" finds authenticate_user). Downloads a 23 MB ONNX model once; no PyTorch required.
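The control flow is: try lexical matching first, and only consult the embedding model when no task term overlaps a symbol name. A toy sketch of that logic, where `embed` stands in for the real ONNX model and all names are invented:

```python
import math


def cosine(u: list[float], v: list[float]) -> float:
    """Cosine similarity between two vectors (0.0 if either is zero)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0


def lexical_hits(task: str, symbols: list[str]) -> list[str]:
    """Symbols sharing at least one word with the task (BM25 stand-in)."""
    terms = set(task.lower().split())
    return [s for s in symbols if terms & set(s.lower().split("_"))]


def find_entry_symbols(task, symbols, embed, threshold=0.5):
    hits = lexical_hits(task, symbols)
    if hits:          # task terms appear in symbol names: no model needed
        return hits
    task_vec = embed(task)   # fallback: semantic similarity via embeddings
    return [s for s in symbols if cosine(task_vec, embed(s)) >= threshold]
```

The base install never takes the embedding branch, which is why it needs no model download at all.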


How it works

Every step runs locally with no external calls:

  1. Classify the task: BUG_FIX, NEW_FEATURE, or REFACTOR (keyword scoring, no LLM)
  2. Find entry nodes via BM25 over symbol names, file paths, and docstrings
  3. Traverse the dependency graph with a strategy tuned per task type
  4. Score candidates, enforce a token budget, return slices with rationale
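Steps 1 and 3 can be sketched in a few lines. This is a condensed illustration with invented keyword sets and names, not the project's actual implementation:

```python
# Step 1: classify the task by keyword scoring (no LLM involved).
# Step 3: breadth-first traversal of the dependency graph up to a depth
# chosen per task type, recording the depth each symbol was reached at.
from collections import deque

BUG_KEYWORDS = {"fix", "bug", "crash", "error"}
FEATURE_KEYWORDS = {"add", "new", "support", "implement"}
REFACTOR_KEYWORDS = {"refactor", "rename", "clean", "extract"}


def classify(task: str) -> str:
    words = set(task.lower().split())
    scores = {
        "BUG_FIX": len(words & BUG_KEYWORDS),
        "NEW_FEATURE": len(words & FEATURE_KEYWORDS),
        "REFACTOR": len(words & REFACTOR_KEYWORDS),
    }
    return max(scores, key=scores.get)  # deterministic tie-breaking


def traverse(graph: dict[str, list[str]], entries: list[str], max_depth: int) -> dict[str, int]:
    depths = {e: 0 for e in entries}
    queue = deque(entries)
    while queue:
        node = queue.popleft()
        if depths[node] >= max_depth:
            continue
        for neighbor in graph.get(node, []):
            if neighbor not in depths:
                depths[neighbor] = depths[node] + 1
                queue.append(neighbor)
    return depths
```

The recorded depth feeds directly into step 4's scoring: closer symbols score higher, and the traversal stops long before the token budget could be blown.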

The graph is stored in an embedded KuzuDB database in your project folder. No server process, no port, no auth.

Same repo + same task = same output, every time.


Output

{
  "slices": [
    {
      "file_path": "/abs/path/payments/processor.py",
      "line_start": 6,
      "line_end": 24,
      "rationale": "Included PaymentProcessor as primary task location (matched 'payment')"
    },
    {
      "file_path": "/abs/path/payments/retry_handler.py",
      "line_start": 12,
      "line_end": 38,
      "rationale": "Included RetryHandler because it is called by PaymentProcessor (depth 1)"
    }
  ]
}

Each slice points to the specific function or class that's relevant. A 500-line file with one relevant function costs ~40 tokens, not 500.
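Consuming a slice is just reading the cited 1-indexed, inclusive line range; the rest of the file never enters the model's context. A minimal sketch (helper name invented):

```python
def read_slice(source: str, line_start: int, line_end: int) -> str:
    """Return lines line_start..line_end (1-indexed, inclusive) of a file's text."""
    lines = source.splitlines()
    return "\n".join(lines[line_start - 1:line_end])
```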


Supported languages

Language            Parsing
Python              tree-sitter-python
TypeScript / TSX    tree-sitter-typescript
JavaScript / JSX    tree-sitter-javascript

License

Apache 2.0
