
claude-context-local


Local semantic code search MCP server for Claude Code. Zero external APIs — runs entirely on your machine.

A lightweight alternative to zilliztech/claude-context that uses local embeddings instead of OpenAI + Zilliz Cloud.

Features

  • 100% local — no API keys, no cloud, no data leaves your machine
  • sentence-transformers (all-MiniLM-L6-v2, 88 MB) for embeddings
  • ChromaDB for persistent vector storage
  • Per-project isolation — each project gets its own index
  • Incremental indexing — only re-indexes changed files (MD5 hash)
  • 40+ file types supported out of the box
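
The incremental-indexing check described above can be sketched as a hash manifest: store one MD5 per file, and re-embed only files whose hash changed. A minimal illustration (function names are hypothetical, not the package's actual API):

```python
import hashlib
from pathlib import Path

def file_md5(path: Path) -> str:
    """MD5 of a file's bytes -- a cheap change fingerprint, not a security hash."""
    return hashlib.md5(path.read_bytes()).hexdigest()

def changed_files(paths: list[Path], manifest: dict[str, str]) -> list[Path]:
    """Return only the files whose current hash differs from the stored manifest."""
    return [p for p in paths if manifest.get(str(p)) != file_md5(p)]
```

On a re-index, files returned by `changed_files` are re-chunked and re-embedded; everything else is skipped.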

Quick start

claude mcp add claude-context-local -- uvx claude-context-local

That's it. Restart Claude Code and the tools are available.

Alternative: pip

pip install claude-context-local
claude mcp add claude-context-local -- claude-context-local

Alternative: from source

git clone https://github.com/tazhate/claude-context-local.git
cd claude-context-local
pip install -e .
claude mcp add claude-context-local -- claude-context-local

MCP Tools

| Tool | Description |
| --- | --- |
| `index_project(project_path)` | Index a codebase. Incremental by default; pass `force=True` to rebuild. |
| `search_code(query, project_path)` | Semantic search. Optional `n_results` and `file_filter` (e.g. `"*.py"`). |
| `index_status(project_path)` | Show the count of indexed files and chunks. |
| `drop_index(project_path)` | Remove a project's index. |
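
Under the hood these are standard MCP tools, invoked via JSON-RPC `tools/call` requests. A sketch of the payload Claude Code would send for `search_code` (argument values are illustrative):

```python
import json

# Illustrative MCP tools/call request for search_code; field values are examples.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_code",
        "arguments": {
            "query": "authentication logic",
            "project_path": "/home/user/myproject",
            "n_results": 5,
            "file_filter": "*.py",
        },
    },
}
payload = json.dumps(request)  # sent to the server over stdio
```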

Usage

Once connected, Claude Code will automatically use these tools. You can also ask directly:

  • "Index this project" — triggers index_project with current working directory
  • "Search for authentication logic" — triggers search_code
  • "How many files are indexed?" — triggers index_status

How it works

┌──────────────┐     ┌────────────────────┐     ┌──────────┐
│ Claude Code  │────▸│ claude-context-    │────▸│ ChromaDB │
│ (MCP client) │◂────│ local (MCP server) │◂────│ (vectors)│
└──────────────┘     └────────────────────┘     └──────────┘
                               │
                       ┌───────┴────────┐
                       │ sentence-      │
                       │ transformers   │
                       │ (embeddings)   │
                       └────────────────┘
  1. Index: Walk project files → split into overlapping chunks (50 lines, 10 overlap) → embed with sentence-transformers → store in ChromaDB
  2. Search: Embed query → cosine similarity search in ChromaDB → return ranked code snippets with file paths and line numbers
  3. Incremental updates: MD5 hash per file — only changed files are re-embedded
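
The chunking in step 1 can be sketched as a sliding window over lines, using the documented defaults of 50 lines per chunk and 10 lines of overlap (a minimal illustration; the function name is hypothetical, not the package's actual API):

```python
def chunk_text(text: str, max_lines: int = 50, overlap: int = 10) -> list[tuple[int, str]]:
    """Split text into overlapping line-based chunks.

    Returns (start_line, chunk) pairs with 1-indexed start lines,
    matching the defaults above: 50-line chunks, 10 lines of overlap.
    """
    lines = text.splitlines()
    step = max_lines - overlap  # advance 40 lines per chunk by default
    chunks = []
    for start in range(0, max(len(lines), 1), step):
        window = lines[start:start + max_lines]
        if not window:
            break
        chunks.append((start + 1, "\n".join(window)))
        if start + max_lines >= len(lines):
            break  # last window already reached the end of the file
    return chunks
```

The recorded start line is what lets search results point back to exact file locations.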

Per-project isolation

Each project gets its own ChromaDB database under ~/.cache/claude-context-local/<hash>/, where <hash> is derived from the absolute project path. Projects never mix.
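
The directory derivation can be sketched as hashing the resolved absolute path (the exact hash function is an internal detail; truncated SHA-256 is shown here purely as an illustration):

```python
import hashlib
from pathlib import Path

def project_index_dir(project_path: str,
                      data_dir: str = "~/.cache/claude-context-local") -> Path:
    """Map a project path to its own isolated index directory.

    Hashing the resolved absolute path makes the mapping deterministic,
    so the same project always lands in the same directory.
    """
    abs_path = str(Path(project_path).expanduser().resolve())
    digest = hashlib.sha256(abs_path.encode()).hexdigest()[:16]
    return Path(data_dir).expanduser() / digest
```

Because the hash is derived from the full path, two projects can never collide into the same ChromaDB database.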

Configuration

Environment variables (pass via claude mcp add -e KEY=VALUE):

| Variable | Default | Description |
| --- | --- | --- |
| `CCL_MODEL` | `all-MiniLM-L6-v2` | sentence-transformers model name |
| `CCL_CHUNK_LINES` | `50` | Maximum lines per chunk |
| `CCL_CHUNK_OVERLAP` | `10` | Lines of overlap between chunks |
| `CCL_DATA_DIR` | `~/.cache/claude-context-local` | Index storage directory |
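
A sketch of how a server like this might read those variables, falling back to the documented defaults (the actual variable handling is internal to the package):

```python
import os

# Read configuration from the environment, with the documented defaults.
config = {
    "model": os.environ.get("CCL_MODEL", "all-MiniLM-L6-v2"),
    "chunk_lines": int(os.environ.get("CCL_CHUNK_LINES", "50")),
    "chunk_overlap": int(os.environ.get("CCL_CHUNK_OVERLAP", "10")),
    "data_dir": os.environ.get(
        "CCL_DATA_DIR",
        os.path.expanduser("~/.cache/claude-context-local"),
    ),
}
```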

Custom model example

claude mcp add claude-context-local \
  -e CCL_MODEL=jinaai/jina-embeddings-v2-base-code \
  -- uvx claude-context-local

Resource usage

| Resource | Value |
| --- | --- |
| RAM | ~780 MB (PyTorch + model in memory) |
| Model on disk | 88 MB (downloaded once) |
| Index size | ~27 MB per 500 files |
| CPU | Near zero at idle |
| First index | ~2 min for 500 files (CPU) |
| Incremental update | Seconds (only changed files) |

Supported file types

Code: .py .go .js .ts .tsx .jsx .rs .java .kt .c .cpp .h .hpp .cs .rb .php .swift .scala .sh .bash .lua .zig .nim .ex .exs .erl .nix

Config: .yaml .yml .toml .json .hcl .tf .sql .graphql .proto

Docs: .md .txt .rst

Web: .html .css .scss .less

Other: Dockerfile, Makefile
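
The file walk can select these with a simple allow-list keyed on extension plus a few special filenames. A sketch of the mechanism (the sets below are abbreviated subsets, and the function name is hypothetical):

```python
from pathlib import Path

# Abbreviated subsets of the supported types listed above.
INDEXABLE_EXTS = {".py", ".go", ".js", ".ts", ".rs", ".java",
                  ".yaml", ".toml", ".json", ".md", ".html", ".css"}
SPECIAL_NAMES = {"Dockerfile", "Makefile"}

def is_indexable(path: Path) -> bool:
    """True if the indexer would pick this file up (illustrative subset)."""
    return path.suffix in INDEXABLE_EXTS or path.name in SPECIAL_NAMES
```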

Comparison with zilliztech/claude-context

| | claude-context-local | zilliztech/claude-context |
| --- | --- | --- |
| Embeddings | Local (sentence-transformers) | OpenAI API |
| Vector DB | Local (ChromaDB) | Zilliz Cloud |
| API keys needed | None | OpenAI + Zilliz |
| Data privacy | 100% local | Cloud |
| Setup | One command | Multiple API keys |
| Cost | Free | Pay per use |
| Search quality | Good | Better (larger models) |
| RAM usage | ~780 MB | ~50 MB (Node.js) |

Development

git clone https://github.com/tazhate/claude-context-local.git
cd claude-context-local
python3 -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"
pytest

License

MIT

