
Grafyx


Real-time codebase understanding for AI coding assistants.


What is Grafyx?

AI coding tools read raw files with zero architectural understanding -- they don't know what calls what, which classes inherit from where, or how your modules connect. Grafyx fixes this by parsing your entire codebase into a full relationship graph using Graph-sitter (built on tree-sitter), then exposing that graph to any AI assistant through the Model Context Protocol (MCP). Your assistant can trace call chains, map dependencies, find related code by description, detect conventions, and understand your project's architecture -- all in real time, with a file watcher that keeps the graph current as you edit.
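MCP transports tool calls as JSON-RPC 2.0 messages over stdio, so an assistant's request to a Grafyx tool is just a line of JSON. A rough sketch (the tool name comes from the table below; the argument name here is an assumption, not Grafyx's exact schema):

```python
import json

# An MCP tool invocation is a JSON-RPC 2.0 "tools/call" request.
# The "arguments" payload is illustrative -- Grafyx's real parameter
# names may differ.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_function_context",
        "arguments": {"function_name": "parse_file"},
    },
}

# Messages travel as serialized JSON over the server's stdin/stdout.
wire = json.dumps(request)
```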


Quick Start

Claude Code

# Zero-install (recommended)
claude mcp add --scope user grafyx -- uvx --from grafyx-mcp grafyx

# Or install with pip first
pip install grafyx-mcp
claude mcp add --scope user grafyx -- grafyx

Cursor / Windsurf / Cline

Add to your MCP config file:

  • Cursor: .cursor/mcp.json (project) or ~/.cursor/mcp.json (global)
  • Windsurf: ~/.codeium/windsurf/mcp_config.json
  • Cline: Cline MCP settings in VS Code

{
  "mcpServers": {
    "grafyx": {
      "command": "uvx",
      "args": ["--from", "grafyx-mcp", "grafyx"]
    }
  }
}

VS Code (GitHub Copilot)

Add to .vscode/mcp.json:

{
  "servers": {
    "grafyx": {
      "command": "uvx",
      "args": ["--from", "grafyx-mcp", "grafyx"]
    }
  }
}

Using pip instead of uvx? Replace the command with: "command": "grafyx" (no args needed).


Available Tools

Tool                   Description
get_project_skeleton   Full project structure with stats per module
get_function_context   Everything about a function: callers, callees, deps
get_file_context       File contents, imports, dependencies
get_class_context      Class methods, inheritance, usages
find_related_code      Natural-language search across the codebase
find_related_files     Find files relevant to a feature by matching symbols
get_dependency_graph   Impact analysis: what depends on what
get_conventions        Detected coding patterns and conventions
get_call_graph         Call-chain tracing upstream and downstream
refresh_graph          Force a re-parse of the codebase
get_module_context     Symbols in a directory/package (intermediate zoom)
get_subclasses         Inheritance tree for a base class
get_unused_symbols     Dead-code detection
set_project            Switch the served project at runtime
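Grafyx's internals aren't shown in this README, but the upstream/downstream tracing that get_call_graph describes can be sketched as a breadth-first walk over a caller-to-callees map (data and function names here are ours, for illustration):

```python
from collections import deque

def trace(call_edges, start, direction="downstream"):
    """BFS over a caller -> callees adjacency map.

    direction="downstream" follows callees; "upstream" follows
    callers by inverting the edge map first.
    """
    if direction == "upstream":
        inverted = {}
        for caller, callees in call_edges.items():
            for callee in callees:
                inverted.setdefault(callee, set()).add(caller)
        call_edges = inverted

    seen, order = {start}, []
    queue = deque([start])
    while queue:
        fn = queue.popleft()
        order.append(fn)
        for nxt in call_edges.get(fn, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return order

edges = {"main": {"load"}, "load": {"parse"}, "parse": set()}
# trace(edges, "main") -> ["main", "load", "parse"]
# trace(edges, "parse", direction="upstream") -> ["parse", "load", "main"]
```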

How It Works

Your AI Assistant
       |
       | MCP Protocol (stdio)
       v
  +-----------+
  |  Grafyx   |  FastMCP server with 14 tools
  |  Server   |
  +-----------+
       |
  +-----------+     +-----------+     +-------------+
  |  Graph    |---->|  Search   |     | Convention  |
  |  Engine   |---->|  Engine   |     | Detector    |
  +-----------+     +-----------+     +-------------+
       |
       v
  +-----------+
  |  Graph-   |  Tree-sitter based parsing
  |  sitter   |
  +-----------+
       |
  +-----------+
  |  Watchdog |  File watcher for live updates
  +-----------+

  1. Startup -- Grafyx detects languages in your project and parses all source files into a semantic graph via Graph-sitter.
  2. Serving -- The FastMCP server exposes 14 tools over stdio. Your AI assistant calls them as needed.
  3. Live updates -- Watchdog monitors file changes. When you save, the graph is automatically re-parsed after a short debounce.
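The debounce in step 3 can be sketched with a resettable timer: every save restarts the countdown, and the re-parse fires only once the burst of events stops. The mechanism and interval Grafyx actually uses may differ.

```python
import threading

class Debouncer:
    """Coalesce a burst of file-change events into one action.

    Each trigger() restarts the timer; the action runs only after
    `delay` seconds pass with no further events.
    """
    def __init__(self, delay, action):
        self.delay = delay
        self.action = action
        self._timer = None
        self._lock = threading.Lock()

    def trigger(self):
        with self._lock:
            if self._timer is not None:
                self._timer.cancel()
            self._timer = threading.Timer(self.delay, self.action)
            self._timer.daemon = True
            self._timer.start()

refreshes = []
d = Debouncer(0.05, lambda: refreshes.append("re-parse"))
for _ in range(5):   # five rapid saves collapse into one re-parse
    d.trigger()
```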

ML-augmented search

Grafyx ships several small numpy-only MLPs trained on real source data:

  • M1 Relevance ranker -- 33-feature MLP scores each search result against the query.
  • M3 Source token filter -- suppresses noise tokens (imports, strings, magic methods) from full-text search.
  • M4 Symbol importance -- weights symbols by caller count, exports, and structural signals.
  • M5 Bi-encoder -- semantic embedding model (BPE tokenizer, FeedForward encoder) for natural-language code search.
  • Gibberish detector -- character-bigram MLP that blocks nonsense queries before they hit the index.

All weights ship inside the wheel (~11 MB total). Inference is pure numpy; no PyTorch is required at runtime.
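The model internals aren't published here, but "pure numpy" inference for a small MLP like the M1 ranker amounts to a chain of affine layers with ReLU. In this sketch only the 33-feature input matches the description above; the hidden size and weights are illustrative:

```python
import numpy as np

def mlp_forward(x, weights):
    """Forward pass through an MLP given [(W, b), ...] layer pairs.

    ReLU is applied between layers but not on the final output.
    Pure numpy -- no deep-learning framework needed at inference.
    """
    h = x
    for i, (W, b) in enumerate(weights):
        h = h @ W + b
        if i < len(weights) - 1:
            h = np.maximum(h, 0.0)
    return h

rng = np.random.default_rng(0)
# e.g. a 33-feature relevance ranker with one hidden layer
layers = [
    (rng.normal(size=(33, 16)), np.zeros(16)),
    (rng.normal(size=(16, 1)), np.zeros(1)),
]
score = mlp_forward(rng.normal(size=(1, 33)), layers)
```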


Supported Languages

Language     Extensions
Python       .py, .pyi
TypeScript   .ts, .tsx
JavaScript   .js, .jsx

Languages are auto-detected. To specify manually:

grafyx --languages python,typescript
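Auto-detection by file extension can be sketched like this; the mapping mirrors the table above, while the function name and exact behavior are our assumptions:

```python
from pathlib import Path

# Extension-to-language mapping, per the supported-languages table.
EXTENSION_MAP = {
    ".py": "python", ".pyi": "python",
    ".ts": "typescript", ".tsx": "typescript",
    ".js": "javascript", ".jsx": "javascript",
}

def detect_languages(paths):
    """Return the sorted set of languages present among the paths."""
    return sorted({EXTENSION_MAP[s] for p in paths
                   if (s := Path(p).suffix) in EXTENSION_MAP})

# detect_languages(["app/main.py", "web/index.tsx"]) -> ["python", "typescript"]
```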

Options

grafyx [OPTIONS]

  --project PATH       Project to analyze (default: current directory)
  --languages LANGS    Comma-separated languages (default: auto-detect)
  --ignore PATTERNS    Additional directories to ignore
  --no-watch           Disable file watching
  --verbose, -v        Debug logging
  --version            Show version

Default ignored: node_modules, .git, __pycache__, .venv, venv, .env, dist, build, .tox, .mypy_cache, .pytest_cache, .ruff_cache, egg-info, .eggs, .next, .nuxt, coverage, .coverage, .nyc_output
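Applying that ignore list can be sketched as a check on path components; how Grafyx actually matches (including the egg-info pattern, treated here as a suffix match) may differ:

```python
from pathlib import PurePath

# Default ignore names from the list above; "egg-info" is handled as
# a directory-name suffix below, which is an assumption on our part.
DEFAULT_IGNORED = {
    "node_modules", ".git", "__pycache__", ".venv", "venv", ".env",
    "dist", "build", ".tox", ".mypy_cache", ".pytest_cache",
    ".ruff_cache", ".eggs", ".next", ".nuxt", "coverage",
    ".coverage", ".nyc_output",
}

def is_ignored(path, extra=()):
    """True if any directory component matches an ignored name
    (defaults plus any --ignore additions)."""
    ignored = DEFAULT_IGNORED | set(extra)
    return any(part in ignored or part.endswith(".egg-info")
               for part in PurePath(path).parts)
```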


Multi-Agent Support

Grafyx works with agent teams. A single Grafyx instance serves all agents connected to the same project. When one agent modifies code, the file watcher updates the graph automatically, so other agents immediately see the changes.


Contributing

git clone https://github.com/bilal07karadeniz/Grafyx.git
cd Grafyx
pip install -e ".[dev]"
pytest

Troubleshooting

Windows: Graph-sitter requires Linux. Use WSL and configure your MCP client to launch via wsl:

{
  "mcpServers": {
    "grafyx": {
      "command": "wsl",
      "args": ["-e", "bash", "-c", "source ~/your-venv/bin/activate && grafyx"]
    }
  }
}

License

MIT -- see LICENSE for details.
