MCP Project Context Server
A Python MCP server that gives LLMs persistent, searchable access to project context — documentation, architecture decisions, and session notes.
📖 About the Server
MCP Project Context Server provides a robust, production-ready Model Context Protocol (MCP) server implementation designed to give Large Language Models (LLMs) persistent, searchable access to your project's contextual information.
Core Capabilities
- 🔍 Semantic Search Engine: Query your project documentation using natural language
- 📚 Persistent Knowledge Base: Store and retrieve information from a `.context/` directory structure
- 🏗️ Modular Architecture: Clean 4-layer design following SOLID principles
- 🎯 ADR Integration: Full support for Architecture Decision Records with lifecycle management
- 📝 Session Tracking: Record and retrieve session notes for future reference
- 💾 Vector Store Backend: ChromaDB for fast, persistent, embedded vector storage
- 🔄 Easy Reindexing: Rebuild your knowledge base with a single command
Key Features
- ✅ Model-Agnostic: Works with any embedding model served via Ollama (other providers planned)
- ✅ Configuration-Free: Environment variable-based setup, no hardcoded paths
- ✅ Cross-Platform: POSIX path normalization ensures consistency across OS
- ✅ Async-First: All operations use async/await for performance and scalability
- ✅ Error-Resilient: Graceful error handling with informative messaging
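The cross-platform guarantee rests on normalizing document paths to POSIX form before they are used as identifiers. A minimal illustration with `pathlib` (the exact helper in this package may differ):

```python
from pathlib import PureWindowsPath

def to_posix(path: str) -> str:
    """Normalize a path string to POSIX form so document IDs
    are identical across operating systems."""
    # PureWindowsPath accepts both '\\' and '/' separators,
    # so it is a safe intermediate on any OS.
    return PureWindowsPath(path).as_posix()

print(to_posix(r".context\decisions\ADR-00001.md"))  # .context/decisions/ADR-00001.md
print(to_posix(".context/decisions/ADR-00001.md"))   # .context/decisions/ADR-00001.md
```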
🚀 Getting Started
Prerequisites
Before installing, ensure you have:
- Python 3.11+ installed
- Ollama running with an embedding model (e.g., `nomic-embed-text`)
- At least 2 GB of RAM available
- At least 4.5 GB of disk space for ChromaDB
Installation
Option 1: PyPI (Recommended)
pip install mcp-project-context-server
Option 2: From Source
git clone https://github.com/your-org/mcp-project-context-server.git
cd mcp-project-context-server
pip install -e ".[dev]" # Install with development dependencies
Configuration
Set the following environment variables:
# Ollama Configuration (Required)
export OLLAMA_HOST="http://localhost:11434"
export EMBED_MODEL="nomic-embed-text"
# ChromaDB Configuration (Optional)
export CHROMA_DIR="$HOME/.mcp-data/chroma"
# Runtime Configuration
export EMBED_CONCURRENCY="4" # Max concurrent embeddings
export PROJECT_PATH="/path/to/project" # Optional, defaults to CWD
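All settings are plain environment variables, so the server can resolve them with sensible fallbacks. A sketch of how that resolution might look (not the package's actual code; defaults taken from the reference table in this README):

```python
import os
from pathlib import Path

def load_settings() -> dict:
    """Resolve runtime settings from the environment, falling back to defaults."""
    return {
        "ollama_host": os.environ.get("OLLAMA_HOST", "http://localhost:11434"),
        "embed_model": os.environ.get("EMBED_MODEL", "nomic-embed-text"),
        "chroma_dir": os.environ.get("CHROMA_DIR", str(Path.home() / ".mcp-data" / "chroma")),
        "embed_concurrency": int(os.environ.get("EMBED_CONCURRENCY", "4")),
        "project_path": os.environ.get("PROJECT_PATH", os.getcwd()),
    }

settings = load_settings()
print(settings["embed_model"])
```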
🖥️ Client Setup
Universal MCP Client Integration
The server follows the standard MCP protocol, making it compatible with any MCP client that supports stdio transport.
Supported MCP Clients
| Client | Status | Setup Instructions |
|---|---|---|
| Claude Desktop | ✅ Tested | See Claude Desktop Setup |
| Claude Code | ✅ Tested | See Claude Code Setup |
| Cursor | ✅ Tested | See Cursor Setup |
| Continue | ✅ Tested | See Continue Setup |
| Windsurf | ✅ Compatible | See Windsurf Setup |
| VS Code Copilot | ✅ Compatible | See VS Code Copilot Setup |
Claude Desktop Setup
1. Install the server (see Installation)

2. Locate the config file for your OS:

   | OS | Config File Location |
   |---|---|
   | Windows | `%APPDATA%\Claude\claude_desktop_config.json` |
   | macOS | `~/Library/Application Support/Claude/claude_desktop_config.json` |
   | Linux | `~/.config/Claude/claude_desktop_config.json` |

3. Configure MCP settings in `claude_desktop_config.json`:

   Windows:

   {
     "mcpServers": {
       "project-context": {
         "command": "python",
         "args": ["-m", "mcp_project_context_server"],
         "env": {
           "OLLAMA_HOST": "http://localhost:11434",
           "EMBED_MODEL": "nomic-embed-text",
           "CHROMA_DIR": "%USERPROFILE%\\.mcp-data\\chroma"
         }
       }
     }
   }

   macOS / Linux:

   {
     "mcpServers": {
       "project-context": {
         "command": "python",
         "args": ["-m", "mcp_project_context_server"],
         "env": {
           "OLLAMA_HOST": "http://localhost:11434",
           "EMBED_MODEL": "nomic-embed-text",
           "CHROMA_DIR": "~/.mcp-data/chroma"
         }
       }
     }
   }

4. Restart Claude Desktop so the new configuration is picked up.

5. Use in Claude Desktop by asking questions about your project context:
   - Try asking: "What was the decision in ADR-00001?"
   - Verify semantic search works with project-specific queries
Claude Code Setup
1. Install the server (see Installation)

2. Add the MCP server using one of two methods:

   Option A — CLI (Recommended):

   claude mcp add project-context -- python -m mcp_project_context_server

   To include environment variables:

   claude mcp add project-context \
     -e OLLAMA_HOST=http://localhost:11434 \
     -e EMBED_MODEL=nomic-embed-text \
     -e CHROMA_DIR=~/.mcp-data/chroma \
     -- python -m mcp_project_context_server

   Option B — Config file:

   Claude Code supports both user-level and project-level configuration:

   | Scope | Location |
   |---|---|
   | User (global) | `~/.claude.json` |
   | Project | `.claude/settings.json` (in project root) |

   Add the following to the `mcpServers` key:

   {
     "mcpServers": {
       "project-context": {
         "command": "python",
         "args": ["-m", "mcp_project_context_server"],
         "env": {
           "OLLAMA_HOST": "http://localhost:11434",
           "EMBED_MODEL": "nomic-embed-text",
           "CHROMA_DIR": "~/.mcp-data/chroma"
         }
       }
     }
   }

3. Verify the server is connected:

   claude mcp list

4. Use in Claude Code by referencing the tools directly in your session, or by asking questions about your project context.
Cursor Setup
1. Install the MCP server (see Installation)

2. Choose a config scope — Cursor supports both global and project-level MCP configuration:

   | Scope | Windows | macOS / Linux |
   |---|---|---|
   | Global | `%USERPROFILE%\.cursor\mcp.json` | `~/.cursor/mcp.json` |
   | Project | `.cursor\mcp.json` (in project root) | `.cursor/mcp.json` (in project root) |

3. Configure in `mcp.json`:

   {
     "mcpServers": {
       "project-context": {
         "command": "python",
         "args": ["-m", "mcp_project_context_server"],
         "env": {
           "OLLAMA_HOST": "http://localhost:11434",
           "EMBED_MODEL": "nomic-embed-text",
           "CHROMA_DIR": "~/.mcp-data/chroma"
         }
       }
     }
   }

4. Test functionality:
   - Use `@project-context` in chat
   - Ask context-aware questions about your project
   - Access ADRs and documentation via natural language
Continue Setup
1. Install the Continue VS Code or JetBrains extension

2. Locate the config file for your OS:

   | OS | Config File Location |
   |---|---|
   | Windows | `%USERPROFILE%\.continue\config.yaml` |
   | macOS / Linux | `~/.continue/config.yaml` |

   Continue also supports `config.json` for legacy setups, but `config.yaml` is the current default.

3. Add to `config.yaml`:

   mcpServers:
     - name: project-context
       command: python
       args:
         - "-m"
         - mcp_project_context_server
       env:
         OLLAMA_HOST: "http://localhost:11434"
         EMBED_MODEL: "nomic-embed-text"
         CHROMA_DIR: "~/.mcp-data/chroma"

   Or if using `config.json`:

   {
     "mcpServers": [
       {
         "name": "project-context",
         "command": "python",
         "args": ["-m", "mcp_project_context_server"],
         "env": {
           "OLLAMA_HOST": "http://localhost:11434",
           "EMBED_MODEL": "nomic-embed-text",
           "CHROMA_DIR": "~/.mcp-data/chroma"
         }
       }
     ]
   }

4. Usage:
   - Trigger context queries in the chat panel
   - Access project documentation mid-conversation
   - Maintain context across multi-turn conversations
Windsurf Setup
1. Install the MCP server (see Installation)

2. Locate the MCP config file for your OS:

   | OS | Config File Location |
   |---|---|
   | Windows | `%USERPROFILE%\.codeium\windsurf\mcp_config.json` |
   | macOS / Linux | `~/.codeium/windsurf/mcp_config.json` |

3. Configure in `mcp_config.json` (create it if it does not exist):

   {
     "mcpServers": {
       "project-context": {
         "command": "python",
         "args": ["-m", "mcp_project_context_server"],
         "env": {
           "OLLAMA_HOST": "http://localhost:11434",
           "EMBED_MODEL": "nomic-embed-text",
           "CHROMA_DIR": "~/.mcp-data/chroma"
         }
       }
     }
   }

4. Restart Windsurf and verify the MCP server appears under Settings → MCP Servers.
VS Code Copilot Setup
MCP support is built into VS Code via GitHub Copilot (no separate extension required). Requires VS Code 1.99+ with the Copilot extension.
1. Install the server (see Installation)

2. Choose a config scope:

   Option A — Workspace (`.vscode/mcp.json`):

   Create `.vscode/mcp.json` in your project root (works identically on all OSes):

   {
     "servers": {
       "project-context": {
         "type": "stdio",
         "command": "python",
         "args": ["-m", "mcp_project_context_server"],
         "env": {
           "OLLAMA_HOST": "http://localhost:11434",
           "EMBED_MODEL": "nomic-embed-text",
           "CHROMA_DIR": "${env:USERPROFILE}/.mcp-data/chroma"
         }
       }
     }
   }

   Note: Use `${env:USERPROFILE}/.mcp-data/chroma` on Windows or `~/.mcp-data/chroma` on macOS/Linux for `CHROMA_DIR`.

   Option B — User settings (`settings.json`):

   Open VS Code settings (Ctrl + , / Cmd + ,) and add to `settings.json`:

   {
     "mcp": {
       "servers": {
         "project-context": {
           "type": "stdio",
           "command": "python",
           "args": ["-m", "mcp_project_context_server"],
           "env": {
             "OLLAMA_HOST": "http://localhost:11434",
             "EMBED_MODEL": "nomic-embed-text",
             "CHROMA_DIR": "~/.mcp-data/chroma"
           }
         }
       }
     }
   }

3. Use in Copilot Chat: switch to Agent mode and the MCP tools become available automatically.
IDE-Specific Best Practices
PyCharm
While PyCharm doesn't natively support MCP, you can:
1. Use the CLI mode (the command is identical on Windows, macOS, and Linux):

   project-context-server search "your query"

2. Or start the server from a Python interpreter:

   from mcp_project_context_server.server import run

   run()  # Start the server, then connect via an MCP client
Vim/Neovim (with mcp.nvim)
-- In your Neovim config (works on Windows, macOS, and Linux)
require('mcp').connect({
name = 'project-context',
command = 'python',
args = {'-m', 'mcp_project_context_server'},
env = {
OLLAMA_HOST = 'http://localhost:11434'
}
})
Sublime Text (with Sublime MCP)
Similar to VS Code, configure in Sublime's MCP settings file with the same JSON structure.
🛠️ Usage Examples
Semantic Search
Ask natural language questions about your project:
# Example: Ask about your project's architecture
# Expected: Retrieves relevant ADRs and documentation
search_project_context(
query="How do we handle data persistence?",
n_results=5
)
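Under the hood, semantic search embeds the query and ranks stored chunk embeddings by similarity. ChromaDB handles this for you, but the core idea reduces to cosine similarity; a toy illustration with made-up 3-dimensional vectors (real embeddings come from the embedding model):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "index": chunk id -> embedding vector
index = {
    "adr-00002": [0.9, 0.1, 0.0],
    "project-overview": [0.1, 0.8, 0.3],
}
query_vec = [0.85, 0.15, 0.05]

# The best match is the chunk whose vector points in the same direction
best = max(index, key=lambda cid: cosine(query_vec, index[cid]))
print(best)  # adr-00002
```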
Load Full Context
Get all documentation at once:
load_project_context()
# Returns concatenated content of:
# - project.md
# - All ADRs
# - Latest session file
Save Session Notes
save_session_summary(
summary="Investigated chunking strategy alternatives, decided on fixed-size for now"
)
# Creates: .context/sessions/YYYY-MM-DD.md
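The behaviour above — one dated markdown file per day under `.context/sessions/` — can be sketched as follows (`save_session` is a hypothetical helper, not the package's implementation):

```python
import tempfile
from datetime import date
from pathlib import Path

def save_session(summary: str, context_dir: str = ".context") -> Path:
    """Append a summary to today's session note under <context_dir>/sessions/."""
    sessions = Path(context_dir) / "sessions"
    sessions.mkdir(parents=True, exist_ok=True)
    note = sessions / f"{date.today().isoformat()}.md"  # YYYY-MM-DD.md
    with note.open("a", encoding="utf-8") as f:
        f.write(f"- {summary}\n")
    return note

# Demonstrate in a throwaway directory
with tempfile.TemporaryDirectory() as tmp:
    note = save_session("Decided on fixed-size chunking for now", context_dir=tmp)
    print(note.name)  # e.g. 2025-06-01.md
```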
Rebuild Index
index_project_context()
# Drops existing collection and rebuilds from .context/
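Reindexing re-runs the chunking and embedding pipeline over `.context/`. The package's exact chunker parameters aren't documented here, but a fixed-size strategy (the approach ADR-00007 discusses) looks roughly like this:

```python
def chunk_text(text: str, size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into fixed-size character chunks with a small overlap,
    so content cut at a boundary still appears whole in one chunk."""
    if size <= overlap:
        raise ValueError("size must exceed overlap")
    step = size - overlap
    chunks = []
    for start in range(0, len(text), step):
        chunk = text[start:start + size]
        if chunk:
            chunks.append(chunk)
    return chunks

doc = "x" * 1200
print(len(chunk_text(doc)))  # 3 chunks: [0:500], [450:950], [900:1200]
```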
📂 Project Structure
mcp-project-context-server/
├── src/mcp_project_context_server/
│ ├── __init__.py
│ ├── __main__.py
│ ├── server.py # MCP server entry point
│ ├── tools/
│ │ ├── load_context.py # load_project_context tool
│ │ ├── search_context.py # search_project_context tool
│ │ ├── save_session.py # save_session_summary tool
│ │ └── index_context.py # index_project_context tool
│ ├── integrations/
│ │ ├── chroma/
│ │ │ └── client.py # ChromaDB client
│ │ └── ollama/
│ │ └── client.py # Ollama client
│ ├── indexing/
│ │ ├── chroma/
│ │ │ └── indexer.py # Chunking & embedding pipeline
│ │ └── ollama/
│ │ └── embedder.py # Embedding wrappers
│ └── helpers/
│ └── context.py # Utility functions
├── .context/ # Project context directory
│ ├── project.md # Project overview
│ ├── sessions/ # Session notes
│ └── decisions/ # ADRs
├── scripts/
│ └── test_client.py # Integration smoke test
├── README.md
├── pyproject.toml
└── LICENSE
🧪 Testing
Manual Integration Test
python scripts/test_client.py
Development Workflow
# Run pytest
pytest tests/
# Test coverage
pytest --cov=src/mcp_project_context_server
# Lint and format
ruff check src/
black src/
🌐 Environment Variables Reference
| Variable | Default | Description |
|---|---|---|
| `OLLAMA_HOST` | `http://localhost:11434` | Ollama server URL |
| `EMBED_MODEL` | `nomic-embed-text` | Embedding model name |
| `CHROMA_DIR` | `~/.mcp-data/chroma` | ChromaDB persistence directory |
| `EMBED_CONCURRENCY` | `4` | Max concurrent embedding requests |
| `PROJECT_PATH` | CWD | Path to project root (optional) |
| `MCP_TOOL_PREFIX` | `project-context-` | Prefix for tool names |
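`EMBED_CONCURRENCY` caps how many embedding requests are in flight at once. With the server's async-first design, this is typically enforced with a semaphore; a minimal sketch (the real client code may differ, and the embedding call here is a stand-in):

```python
import asyncio

EMBED_CONCURRENCY = 4  # normally read from the environment

async def embed(text: str, sem: asyncio.Semaphore) -> list[float]:
    """Stand-in for a real Ollama embedding call, gated by the semaphore."""
    async with sem:
        await asyncio.sleep(0)     # placeholder for the HTTP round-trip
        return [float(len(text))]  # fake one-dimensional "embedding"

async def embed_all(texts: list[str]) -> list[list[float]]:
    # At most EMBED_CONCURRENCY embed() calls run concurrently
    sem = asyncio.Semaphore(EMBED_CONCURRENCY)
    return await asyncio.gather(*(embed(t, sem) for t in texts))

vectors = asyncio.run(embed_all(["alpha", "beta", "gamma"]))
print(vectors)  # [[5.0], [4.0], [5.0]]
```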
🔮 Roadmap
Planned Features
- Auto-reindex: Watchdog-based file monitoring for automatic reindexing
- YAML Configuration: Replace environment variables with config files
- Codebase Indexing: Repomix integration for source code analysis
- Enhanced ADR Tools: First-class MCP tools for ADR lifecycle
- Repository Bootstrapping: Automatic `.context/` generation
- Batch Operations: Bulk ADR updates and session imports
- API Endpoint: HTTP REST API alternative to MCP tools
Community Contributions
Contributions are welcome! See CONTRIBUTING.md for guidelines.
📜 Architecture Decision Records
Explore our ADRs to understand architectural decisions:
- ADR-00001: MCP Protocol Selection
- ADR-00002: ChromaDB as Vector Store
- ADR-00003: Embedding Provider Selection
- ADR-00004: Modular Tool Architecture
- ADR-00005: Environment Configuration
- ADR-00006: Drop & Recreate Indexing
- ADR-00007: Chunking Strategy
- ADR-00008: POSIX Path Normalization
- ADR-00009: Repomix Integration
- ADR-00010: ADR Tooling
- ADR-00011: Repository Bootstrapping
📝 License
This project is licensed under the MIT License - see the LICENSE file for details.
🤝 Contributing
- Fork the repository
- Create a feature branch
- Commit your changes
- Push to the branch
- Open a Pull Request
See CONTRIBUTING.md for detailed contribution guidelines.
🙏 Acknowledgments
- MCP Team: For the Model Context Protocol
- ChromaDB: For the vector store implementation
- Ollama: For the embedding model hosting
Built with ❤️ for better LLM project understanding
File details
Details for the file mcp_project_context_server-0.0.7.tar.gz.
File metadata
- Download URL: mcp_project_context_server-0.0.7.tar.gz
- Upload date:
- Size: 57.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `2aca9da18b7e5774472073b4811cd185f60be31fa6521b82b383ab3396d18f44` |
| MD5 | `8f197cbb6011def295128f7c43f6cbbe` |
| BLAKE2b-256 | `bc7d0ec00cc215864f8dc2e612163e050d372d2ca98d63c3cf48dee146e52f07` |
Provenance
The following attestation bundles were made for mcp_project_context_server-0.0.7.tar.gz:
Publisher: `build-and-publish.yml` on `DarkMatterProductions/mcp-project-context-server`

- Statement type: `https://in-toto.io/Statement/v1`
- Predicate type: `https://docs.pypi.org/attestations/publish/v1`
- Subject name: `mcp_project_context_server-0.0.7.tar.gz`
- Subject digest: `2aca9da18b7e5774472073b4811cd185f60be31fa6521b82b383ab3396d18f44`
- Sigstore transparency entry: 1338934079
- Sigstore integration time:
- Permalink: `DarkMatterProductions/mcp-project-context-server@7ee4539ee0cca95fe51b671ee2ef6839a24e22f8`
- Branch / Tag: `refs/heads/main`
- Owner: https://github.com/DarkMatterProductions
- Access: public
- Token Issuer: `https://token.actions.githubusercontent.com`
- Runner Environment: `github-hosted`
- Publication workflow: `build-and-publish.yml@7ee4539ee0cca95fe51b671ee2ef6839a24e22f8`
- Trigger Event: push
File details
Details for the file mcp_project_context_server-0.0.7-py3-none-any.whl.
File metadata
- Download URL: mcp_project_context_server-0.0.7-py3-none-any.whl
- Upload date:
- Size: 41.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `a07d0391bc8ab559807547e1c90455696f66e38799daadc7a3ebd9b3550fc14b` |
| MD5 | `5767f4080f4b8cf509f0afc78864045c` |
| BLAKE2b-256 | `515c362443da0cf09565b463e0a08de2c84a8e09d5f48b0fa03a7b34ea0dc44b` |
Provenance
The following attestation bundles were made for mcp_project_context_server-0.0.7-py3-none-any.whl:
Publisher: `build-and-publish.yml` on `DarkMatterProductions/mcp-project-context-server`

- Statement type: `https://in-toto.io/Statement/v1`
- Predicate type: `https://docs.pypi.org/attestations/publish/v1`
- Subject name: `mcp_project_context_server-0.0.7-py3-none-any.whl`
- Subject digest: `a07d0391bc8ab559807547e1c90455696f66e38799daadc7a3ebd9b3550fc14b`
- Sigstore transparency entry: 1338934084
- Sigstore integration time:
- Permalink: `DarkMatterProductions/mcp-project-context-server@7ee4539ee0cca95fe51b671ee2ef6839a24e22f8`
- Branch / Tag: `refs/heads/main`
- Owner: https://github.com/DarkMatterProductions
- Access: public
- Token Issuer: `https://token.actions.githubusercontent.com`
- Runner Environment: `github-hosted`
- Publication workflow: `build-and-publish.yml@7ee4539ee0cca95fe51b671ee2ef6839a24e22f8`
- Trigger Event: push