A zero-dependency, in-memory graph database providing a memory layer for local AI agents with AST-based dependency tracking and Token-Level Provenance via Source Maps.

MCP Context Graph

A self-contained, in-memory graph database for AI agents that provides semantic code understanding through the Model Context Protocol (MCP).

Why This Tool?

AI coding assistants often struggle with large codebases. Typically, they:

  • Read entire files (expensive, hits context limits)
  • Grep for text (misses semantic relationships)
  • Lose track of where functions are called from

MCP Context Graph solves this by building a semantic graph of your codebase that the AI can query efficiently.

Key Differentiators

| Feature | Benefit |
| --- | --- |
| Token-Level Source Maps | Stores minified signatures but can expand to the exact original source with character-accurate mapping |
| Semantic Call Graph | "Who calls this function?" answered in milliseconds, not by reading every file |
| Polyglot Engine | Python, TypeScript, and JavaScript parsed with tree-sitter grammars |
| Zero Install | Runs instantly with `uvx`; no pip install, no dependencies to manage |
| MCP Native | Built for AI agents; exposes tools through the Model Context Protocol |
| Smart Exclusions | Respects `.gitignore`; skips `node_modules`, `.venv`, and `__pycache__` automatically |
| Lazy Ingestion | Files indexed on-demand, auto-refreshed when modified |

How It Works

```
Your Codebase                    MCP Context Graph                AI Agent
     |                                  |                              |
     |  ──── tree-sitter parse ────>    |                              |
     |                                  |                              |
     |  <── minified signatures ────    |                              |
     |      + source maps               |                              |
     |                                  |                              |
     |                                  |  <── find_callers("fn") ──   |
     |                                  |  ──> [caller1, caller2] ───> |
     |                                  |                              |
     |                                  |  <── expand_source(id) ────  |
     |                                  |  ──> exact original code ─>  |
```

The AI gets fast semantic queries without loading entire files. When it needs the full source, it can expand specific symbols using source maps.
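The core idea can be sketched in a few lines: symbols become nodes, call relationships become edges, and a reverse index turns "who calls X?" into a dictionary lookup instead of a file scan. This is an illustration of the concept, not the project's actual data model:

```python
from collections import defaultdict

# Toy call graph: a forward index (caller -> callees) plus a reverse
# index (callee -> callers), so find_callers() never rereads source files.
class CallGraph:
    def __init__(self) -> None:
        self.calls: dict[str, set[str]] = defaultdict(set)    # caller -> {callees}
        self.callers: dict[str, set[str]] = defaultdict(set)  # callee -> {callers}

    def add_call(self, caller: str, callee: str) -> None:
        self.calls[caller].add(callee)
        self.callers[callee].add(caller)

    def find_callers(self, name: str) -> set[str]:
        return self.callers[name]

g = CallGraph()
g.add_call("checkout", "calculate_tax")
g.add_call("refund", "calculate_tax")
print(sorted(g.find_callers("calculate_tax")))  # ['checkout', 'refund']
```

Because the reverse index is maintained at insert time, each caller query is a single hash lookup, which is why such queries complete in microseconds.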

Features

  • Polyglot Support: Parses Python, TypeScript, and JavaScript using tree-sitter
  • Source Maps: Token-level provenance for precise context extraction
  • Lazy Ingestion: Files are indexed on-demand and refreshed automatically
  • Zero Configuration: Works instantly via uvx with sensible defaults

Installation

Primary Method (Recommended)

No installation required. Run directly with uvx:

```bash
uvx mcp-context-graph /path/to/your/project
```

Alternative: Install via pip/uv

```bash
# Using uv
uv pip install mcp-context-graph

# Using pip
pip install mcp-context-graph
```

How to Use

MCP Context Graph runs as an MCP server that AI assistants can connect to. Configure it in your MCP client:

A. In Cline (VS Code)

Add to your Cline MCP settings (cline_mcp_settings.json):

```json
{
  "mcpServers": {
    "context-graph": {
      "command": "uvx",
      "args": ["mcp-context-graph", "."],
      "autoApprove": []
    }
  }
}
```

The `.` argument uses the current workspace directory as the project root.

B. In Claude Desktop

Add to your Claude Desktop configuration (claude_desktop_config.json):

```json
{
  "mcpServers": {
    "context-graph": {
      "command": "uvx",
      "args": ["mcp-context-graph", "/absolute/path/to/your/project"]
    }
  }
}
```

Note: Claude Desktop requires an absolute path. Relative paths like `.` will not work correctly.

Available Tools

Once connected, the following tools are available to the AI agent:

| Tool | Description |
| --- | --- |
| `index_project` | Full scan of the project directory; builds the code graph. |
| `find_symbol` | Find function/class definitions by name. |
| `find_callers` | Find all locations that call a specific function. |
| `get_context` | Get a context window around a symbol (callers, callees). |
| `expand_source` | De-minify a node using source maps to recover the full original code. |
| `debug_dump_graph` | Export the graph in Mermaid, JSON, or DOT format. |

Example Workflow

  1. AI indexes the project: `index_project()`
  2. AI finds a function definition: `find_symbol(name="calculate_tax", include_calls=true)`
  3. AI explores what calls that function: `find_callers(name="calculate_tax")`
  4. AI gets broader context (2 levels of connections): `get_context(name="calculate_tax", depth=2, format="markdown")`
  5. AI expands a specific symbol to see the full source: `expand_source(symbol_id="abc123")`
    

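Internally, a depth-limited context query like the one in step 4 can be thought of as a breadth-first walk over caller and callee edges. The sketch below is illustrative only; the edge data and helper names are invented for the example:

```python
from collections import deque

# Invented call-edge data: caller -> {callees}. The real server derives
# this from tree-sitter parses of the project.
edges = {
    "calculate_tax": {"round_half_up"},
    "checkout": {"calculate_tax", "apply_discount"},
    "billing_job": {"checkout"},
}

def neighbors(name: str) -> set[str]:
    out = set(edges.get(name, set()))  # callees of `name`
    out |= {caller for caller, callees in edges.items() if name in callees}  # callers
    return out

def get_context(name: str, depth: int = 2) -> set[str]:
    """Collect every symbol within `depth` caller/callee hops of `name`."""
    seen, queue = {name}, deque([(name, 0)])
    while queue:
        node, d = queue.popleft()
        if d == depth:
            continue
        for nxt in neighbors(node) - seen:
            seen.add(nxt)
            queue.append((nxt, d + 1))
    return seen - {name}

print(sorted(get_context("calculate_tax", depth=2)))
# ['apply_discount', 'billing_job', 'checkout', 'round_half_up']
```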
Source Maps: The Secret Sauce

Most code indexers store either:

  • Full source code (expensive)
  • Just symbol names (loses context)

MCP Context Graph stores minified signatures with character-accurate source maps:

```python
# Original
def calculate_tax(amount: float, rate: float) -> float:
    """Calculate tax for the given amount."""
    return amount * rate

# Stored signature (minified)
def calculate_tax(amount: float, rate: float) -> float: ...

# Source map
Segment(minified: 0-55, original: 0-55)  # signature preserved exactly
```

When the AI needs the full implementation, expand_source maps the minified offsets back to the original file and returns exact source code.
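A simplified model of that mechanism (the field names and types here are illustrative, not the package's actual API):

```python
from dataclasses import dataclass

# Each segment maps a character range in the minified signature back to a
# character range in the original file. Names here are illustrative only.
@dataclass
class Segment:
    min_start: int
    min_end: int
    orig_start: int
    orig_end: int

original = (
    "def calculate_tax(amount: float, rate: float) -> float:\n"
    '    """Calculate tax for the given amount."""\n'
    "    return amount * rate\n"
)
minified = "def calculate_tax(amount: float, rate: float) -> float: ..."

# The signature (55 characters) is preserved verbatim, so one segment covers it.
seg = Segment(0, 55, 0, 55)

def expand(segment: Segment) -> str:
    """Map minified offsets back to the exact original source text."""
    return original[segment.orig_start:segment.orig_end]

assert minified[seg.min_start:seg.min_end] == expand(seg)
```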

Benchmarks

Benchmark Results

| Metric | Value |
| --- | --- |
| Files processed | 46 |
| Nodes created | 624 |
| Raw source size | 341.2 KB |
| Minified size | 24.6 KB |
| Compression ratio | 13.9x |
| Ingest time | 46 ms |
| `find_definition` | 5 μs |
| `find_callers` | 9 μs |
| `get_context(depth=2)` | 5 μs |

Cost Efficiency (USD per 1,000 calls)

Token counts measured exactly via the OpenRouter API (not estimated).

| Model | Full File | Graph | Savings | Savings % |
| --- | --- | --- | --- | --- |
| gpt-5.2 | $184.05 | $15.40 | $168.65 | 91.6% |
| gpt-4o-mini | $11.04 | $0.92 | $10.12 | 91.6% |
| claude-sonnet-4.5 | $281.17 | $25.41 | $255.76 | 91.0% |
| gemini-2.5-pro | $320.24 | $28.78 | $291.46 | 91.0% |

Key takeaways:

  • The AI can work with a 14x smaller representation of your codebase
  • Queries complete in microseconds, not seconds
  • ~91% cost savings on context token usage across all major LLM providers
  • Full source is always available on-demand via source maps
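The headline figures follow directly from the tables above: the compression ratio is raw size divided by minified size, and savings is (full - graph) / full:

```python
# Reproducing the headline numbers from the benchmark tables above.
raw_kb, minified_kb = 341.2, 24.6
compression = raw_kb / minified_kb
print(f"compression: {compression:.1f}x")  # 13.9x

# Cost savings for gpt-5.2 (USD per 1,000 calls, from the cost table)
full, graph = 184.05, 15.40
savings_pct = (full - graph) / full * 100
print(f"savings: {savings_pct:.1f}%")      # 91.6%
```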

Run benchmarks on your own project:

```bash
OPENROUTER_API_KEY=your-key uv run python benchmarks/run_benchmarks.py /path/to/project
```

CLI Usage

```bash
# Index current directory
uvx mcp-context-graph .

# Index a specific project
uvx mcp-context-graph /path/to/project

# Show version
uvx mcp-context-graph --version
```

Development

Prerequisites

  • Python 3.12+
  • uv package manager

Setup

```bash
# Clone the repository
git clone https://github.com/padobrik/mcp-context-graph.git
cd mcp-context-graph

# Install dependencies
uv sync

# Run tests
uv run pytest tests/

# Run linting
uv run ruff check .

# Run type checking
uv run mypy src/
```

Project Structure

```
src/mcp_context_graph/
  core/           # Graph data structures (Node, Edge, Graph)
  ingest/         # File parsing and graph construction
  languages/      # Language-specific configurations (Python, TypeScript)
  mcp/            # MCP server and tool handlers
  provenance/     # Source map implementation
```

License

MIT License. See LICENSE for details.
