Nexus-Dev
Persistent Memory for AI Coding Agents
Nexus-Dev is an open-source MCP (Model Context Protocol) server that provides a local RAG (Retrieval-Augmented Generation) system for AI coding assistants like GitHub Copilot, Cursor, and Windsurf. It learns from your codebase and mistakes, enabling cross-project knowledge sharing.
Features
- 🧠 Persistent Memory: Index your code and documentation for semantic search
- 📚 Lesson Learning: Record problems and solutions that the AI can recall later
- 🌐 Multi-Language Support: Python, JavaScript/TypeScript, Java (extensible via tree-sitter)
- 📖 Documentation Indexing: Parse and index Markdown/RST documentation
- 🔄 Cross-Project Learning: Share knowledge across all your projects
- 🏠 Local-First: All data stays on your machine with LanceDB
Installation
# Using pip
pip install nexus-dev
# Using uv (recommended)
uv pip install nexus-dev
Quick Start
1. Initialize a Project
cd your-project
nexus-init --project-name "my-project" --embedding-provider openai
This creates:
- `nexus_config.json` - Project configuration
- `.nexus/lessons/` - Directory for learned lessons
2. Set Your API Key (OpenAI only)
The CLI commands require the API key in your environment:
export OPENAI_API_KEY="sk-..."
Tip: Add this to your shell profile (`~/.zshrc`, `~/.bashrc`) so it's always available.
If using Ollama, no API key is needed; just ensure Ollama is running locally.
3. Index Your Code
# Index directories recursively (recommended)
nexus-index src/ -r
# Index multiple directories
nexus-index src/ docs/ -r
# Index specific files (no -r needed)
nexus-index main.py utils.py
Note: The `-r` flag is required to recursively index subdirectories. Without it, only files directly inside the given folder are indexed.
4. Configure Your AI Agent
Add to your MCP client configuration (e.g., Claude Desktop):
{
"mcpServers": {
"nexus-dev": {
"command": "nexus-dev",
"args": []
}
}
}
5. Verify Your Setup
Check indexed content via CLI:
nexus-status
Test in your AI agent — copy and paste this prompt:
Search the Nexus-Dev knowledge base for functions related to "embeddings"
and show me the project statistics.
If the AI uses the `search_code` or `get_project_context` tools and returns results, your setup is complete! 🎉
MCP Tools
Nexus-Dev exposes 7 tools to AI agents:
Search Tools
| Tool | Description |
|---|---|
| `search_knowledge` | Search all content (code, docs, lessons) with an optional `content_type` filter |
| `search_code` | Search indexed code (functions, classes, methods) |
| `search_docs` | Search documentation (Markdown, RST, plain text) |
| `search_lessons` | Search recorded lessons (problems and solutions) |
Indexing Tools
| Tool | Description |
|---|---|
| `index_file` | Index a file into the knowledge base |
| `record_lesson` | Store a problem/solution pair for future reference |
| `get_project_context` | Get project statistics and recent lessons |
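As a rough illustration of the lesson workflow, the sketch below models a lesson record and a naive keyword recall. The field names (`problem`, `solution`, `tags`) are assumptions for illustration; the real `record_lesson` schema is defined by Nexus-Dev, and the real `search_lessons` uses vector similarity rather than keyword overlap.

```python
from dataclasses import dataclass, field

@dataclass
class Lesson:
    # Hypothetical fields; the actual record_lesson schema may differ.
    problem: str
    solution: str
    tags: list[str] = field(default_factory=list)

def search_lessons(lessons: list[Lesson], query: str) -> list[Lesson]:
    """Naive keyword recall; the real server ranks by vector similarity."""
    terms = set(query.lower().split())
    return [l for l in lessons if terms & set(l.problem.lower().split())]

lessons = [
    Lesson("LanceDB search returns empty results",
           "Re-index after changing the embedding provider",
           ["lancedb", "embeddings"]),
]
hits = search_lessons(lessons, "empty search results")
print(hits[0].solution)  # Re-index after changing the embedding provider
```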
MCP Gateway Mode
Nexus-Dev can act as a gateway to other MCP servers, reducing tool count for AI agents.
Setup
1. Initialize the MCP configuration:
   nexus-mcp init --from-global
2. Index tools from the configured servers:
   nexus-index-mcp --all
Usage
Instead of configuring 10 MCP servers (50+ tools), configure only Nexus-Dev:
{
"mcpServers": {
"nexus-dev": {
"command": "nexus-dev"
}
}
}
AI uses these Nexus-Dev tools to access other servers:
| Tool | Description |
|---|---|
| `search_tools` | Find the right tool for a task |
| `invoke_tool` | Execute a tool on any configured server |
| `list_servers` | Show available MCP servers |
Workflow
1. AI searches: `search_tools("create GitHub issue")`
2. Nexus-Dev returns: `github.create_issue` with its schema
3. AI invokes: `invoke_tool("github", "create_issue", {...})`
4. Nexus-Dev proxies the call to the GitHub MCP server
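The gateway pattern in this workflow can be sketched in a few lines. This is a toy model, not Nexus-Dev's actual implementation: `search_tools` here ranks tool descriptions by keyword overlap (the real server uses embeddings), and the server/tool names in the registry are examples.

```python
from typing import Any, Callable

# Hypothetical registry: server -> tool -> (description, handler)
REGISTRY: dict[str, dict[str, tuple[str, Callable[..., Any]]]] = {
    "github": {
        "create_issue": ("Create a GitHub issue in a repository",
                         lambda **kw: {"issue": 1, **kw}),
    },
}

def search_tools(query: str) -> list[str]:
    """Return qualified tool names whose descriptions overlap the query."""
    words = set(query.lower().split())
    hits = []
    for server, tools in REGISTRY.items():
        for name, (desc, _) in tools.items():
            if words & set(desc.lower().split()):
                hits.append(f"{server}.{name}")
    return hits

def invoke_tool(server: str, name: str, args: dict[str, Any]) -> Any:
    """Dispatch a call to the registered downstream handler."""
    _, handler = REGISTRY[server][name]
    return handler(**args)

print(search_tools("create GitHub issue"))   # ['github.create_issue']
print(invoke_tool("github", "create_issue", {"title": "bug"}))
```

The point of the design is that the agent only ever sees three stable gateway tools, no matter how many downstream servers are registered.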
Server Configuration
You can configure downstream MCP servers in .nexus/mcp_config.json using either Stdio (local process) or SSE (HTTP remote) transports.
Local Server (Stdio):
{
"servers": {
"github-local": {
"transport": "stdio",
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-github"],
"env": {
"GITHUB_PERSONAL_ACCESS_TOKEN": "..."
}
}
}
}
Remote Server (SSE):
{
"servers": {
"github-remote": {
"transport": "sse",
"url": "https://api.githubcopilot.com/mcp/",
"headers": {
"Authorization": "Bearer ..."
}
}
}
}
Configuration
nexus_config.json example:
{
"project_id": "550e8400-e29b-41d4-a716-446655440000",
"project_name": "my-project",
"embedding_provider": "openai",
"embedding_model": "text-embedding-3-small",
"docs_folders": ["docs/", "README.md"],
"include_patterns": ["**/*.py", "**/*.js", "**/*.java"],
"exclude_patterns": ["**/node_modules/**", "**/__pycache__/**"]
}
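To make the glob patterns above concrete, here is a rough sketch of how include/exclude filtering could be applied. Note that `fnmatch` only approximates `**` semantics (`*` also crosses `/`), so treat this as an illustration of the config, not Nexus-Dev's actual matcher.

```python
from fnmatch import fnmatch

# Patterns taken from the nexus_config.json example above.
INCLUDE = ["**/*.py", "**/*.js", "**/*.java"]
EXCLUDE = ["**/node_modules/**", "**/__pycache__/**"]

def should_index(path: str) -> bool:
    # Exclusions win over inclusions.
    if any(fnmatch(path, pat) for pat in EXCLUDE):
        return False
    return any(fnmatch(path, pat) for pat in INCLUDE)

print(should_index("src/utils/helpers.py"))          # True
print(should_index("app/node_modules/lodash/x.js"))  # False
print(should_index("docs/readme.txt"))               # False
```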
📖 See docs/adding-mcp-servers.md for a guide on adding custom MCP servers.
Supported Embedding Providers
Nexus-Dev supports multiple embedding providers. Choose the one that best fits your needs.
1. OpenAI (Default)
- Best for: General purpose, ease of use.
- Provider: `openai`
- Default Model: `text-embedding-3-small`
- Configuration: `{ "embedding_provider": "openai", "embedding_model": "text-embedding-3-small" }`
- Environment: Set `OPENAI_API_KEY`.
2. Local Ollama (Privacy / Offline)
- Best for: Privacy, local execution, cost savings.
- Provider: `ollama`
- Default Model: `nomic-embed-text`
- Configuration: `{ "embedding_provider": "ollama", "embedding_model": "nomic-embed-text", "ollama_url": "http://localhost:11434" }`
3. Google Vertex AI (Enterprise)
- Best for: Enterprise GCP users, high scalability.
- Provider: `google`
- Install: `pip install nexus-dev[google]`
- Default Model: `text-embedding-004`
- Configuration: `{ "embedding_provider": "google", "google_project_id": "your-project-id", "google_location": "us-central1" }`
- Environment: Uses standard Google Cloud credentials (ADC).
4. AWS Bedrock (Enterprise)
- Best for: Enterprise AWS users.
- Provider: `aws`
- Install: `pip install nexus-dev[aws]`
- Default Model: `amazon.titan-embed-text-v1`
- Configuration: `{ "embedding_provider": "aws", "aws_region": "us-east-1" }`
5. Voyage AI (High Performance)
- Best for: State-of-the-art retrieval quality (RAG specialist).
- Provider: `voyage`
- Install: `pip install nexus-dev[voyage]`
- Default Model: `voyage-large-2`
- Configuration: `{ "embedding_provider": "voyage", "voyage_api_key": "your-key" }`
6. Cohere (Multilingual)
- Best for: Multilingual search and reranking.
- Provider: `cohere`
- Install: `pip install nexus-dev[cohere]`
- Default Model: `embed-multilingual-v3.0`
- Configuration: `{ "embedding_provider": "cohere", "cohere_api_key": "your-key" }`
⚠️ Warning: Embeddings are NOT portable between providers. Changing providers requires re-indexing all documents.
Optional: Pre-commit Hook
Install automatic indexing on commits:
nexus-init --project-name "my-project" --install-hook
Or manually add to .git/hooks/pre-commit:
#!/bin/bash
# Index staged source files (Added/Copied/Modified) before each commit
MODIFIED=$(git diff --cached --name-only --diff-filter=ACM | grep -E '\.(py|js|ts|java)$')
if [ -n "$MODIFIED" ]; then
    echo "$MODIFIED" | xargs nexus-index
fi
Configuring AI Agents
To maximize Nexus-Dev's value, configure your AI coding assistant to use its tools automatically.
Add AGENTS.md to Your Project
Copy our template to your project:
cp path/to/nexus-dev/docs/AGENTS_TEMPLATE.md your-project/AGENTS.md
This instructs AI agents to:
- Search first before implementing features
- Record lessons after solving bugs
- Use `get_project_context()` at session start
Add Workflow Files (Optional)
cp -r path/to/nexus-dev/.agent/workflows your-project/.agent/
This adds slash commands: /start-session, /search-first, /record-lesson, /index-code
📖 See docs/configuring-agents.md for detailed setup instructions.
Architecture
flowchart TB
subgraph Agent["🤖 AI Agent"]
direction TB
Cursor["Cursor / Copilot / Windsurf"]
end
subgraph MCP["📡 Nexus-Dev (Gateway)"]
direction TB
subgraph Tools["MCP Tools"]
direction TB
search_knowledge["search_knowledge"]
search_code["search_code"]
search_docs["search_docs"]
index_file["index_file"]
gateway_tools["gateway_tools (new)"]
end
subgraph Chunkers["🔧 RAG Pipeline"]
Python["Chunkers"]
Embeddings["Embeddings"]
DB["LanceDB"]
end
end
subgraph External["🌍 External MCP Servers"]
GH["GitHub"]
PG["PostgreSQL"]
Other["..."]
end
Agent -->|"stdio"| Tools
Tools --> Chunkers
gateway_tools -.->|"invoke_tool"| External
Data Flow
sequenceDiagram
participant AI as AI Agent
participant MCP as Nexus-Dev
participant Embed as Embeddings
participant DB as LanceDB
Note over AI,DB: Indexing Flow
AI->>MCP: index_file(path)
MCP->>MCP: Parse with Chunker
MCP->>Embed: Generate embeddings
Embed-->>MCP: Vectors
MCP->>DB: Store chunks + vectors
DB-->>MCP: OK
MCP-->>AI: ✅ Indexed
Note over AI,DB: Search Flow
AI->>MCP: search_knowledge(query)
MCP->>Embed: Embed query
Embed-->>MCP: Query vector
MCP->>DB: Vector similarity search
DB-->>MCP: Results
MCP-->>AI: Formatted results
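The indexing and search flows above can be condensed into a minimal, self-contained sketch. The `embed` step here is a toy bag-of-words vector; Nexus-Dev actually calls a provider embedding model and runs the similarity search inside LanceDB.

```python
import math

VOCAB = ["chunk", "embed", "search", "vector", "store"]
DB: list[tuple[str, list[float]]] = []  # (chunk_text, vector) - stands in for LanceDB

def embed(text: str) -> list[float]:
    """Toy embedding: term counts over a tiny fixed vocabulary."""
    words = text.lower().split()
    return [float(words.count(w)) for w in VOCAB]

def index_chunk(text: str) -> None:
    """Indexing flow: chunk -> embed -> store chunk + vector."""
    DB.append((text, embed(text)))

def search(query: str, k: int = 1) -> list[str]:
    """Search flow: embed query -> rank stored vectors by cosine similarity."""
    q = embed(query)
    def cos(v: list[float]) -> float:
        dot = sum(a * b for a, b in zip(q, v))
        norm = math.sqrt(sum(a * a for a in q)) * math.sqrt(sum(b * b for b in v))
        return dot / norm if norm else 0.0
    ranked = sorted(DB, key=lambda row: cos(row[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

index_chunk("store vector chunks in the database")
index_chunk("embed the query before search")
print(search("how does search embed a query"))  # ['embed the query before search']
```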
Development Setup
Since Nexus-Dev is not yet published to PyPI/Docker Hub, developers must build from source.
Option 1: Local Python Installation (Recommended for Development)
# Clone repository
git clone https://github.com/mmornati/nexus-dev.git
cd nexus-dev
# Option A: Use the Makefile (handles pyenv + venv)
make setup
source .venv/bin/activate
# Option B: Manual setup
pyenv install 3.13 # or use your preferred Python 3.13+ manager
python3 -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]" # Editable install with dev dependencies
After installation, CLI commands are available:
nexus-init --help # Initialize a project
nexus-index --help # Index files
nexus-dev # Run MCP server
Option 2: Docker Build
# Build the image
make docker-build
# or: docker build -t nexus-dev:latest .
# Run with volume mounts
docker run -it --rm \
-v /path/to/your-project:/workspace:ro \
-v nexus-dev-data:/data/nexus-dev \
-e OPENAI_API_KEY=$OPENAI_API_KEY \
nexus-dev:latest
# Or use Makefile shortcuts
make docker-run # Run container
make docker-logs # View logs
make docker-stop # Stop container
Makefile Commands
| Command | Description |
|---|---|
| `make setup` | Full dev environment setup (pyenv + venv + deps) |
| `make install-dev` | Install package with dev dependencies |
| `make lint` | Run ruff linter |
| `make format` | Format code + auto-fix lint issues |
| `make check` | Run all CI checks (lint + format + type-check) |
| `make test` | Run tests |
| `make test-cov` | Run tests with coverage report |
| `make docker-build` | Build Docker image |
| `make docker-run` | Run Docker container |
| `make help` | Show all available commands |
MCP Configuration (Development Mode)
Configure your AI agent to use the locally-built server. This single configuration works for ALL your indexed projects!
For Claude Desktop / Cursor / Windsurf:
{
"mcpServers": {
"nexus-dev": {
"command": "/path/to/nexus-dev/.venv/bin/python",
"args": ["-m", "nexus_dev.server"],
"env": {
"OPENAI_API_KEY": "sk-..."
}
}
}
}
How it works: The server now defaults to searching all indexed projects when no specific project context is active. You don't need to configure `cwd` or create separate MCP entries for each project.
Tip: If `OPENAI_API_KEY` is already in your shell profile (`.zshrc`, `.bashrc`), some clients inherit it automatically. Check your client's documentation.
Using Docker:
The /workspace mount is the server's working directory. It looks for nexus_config.json and .nexus/lessons/ there. Mount your project (or parent directory) to /workspace:
{
"mcpServers": {
"nexus-dev": {
"command": "docker",
"args": [
"run", "-i", "--rm",
"-v", "/path/to/project:/workspace:ro",
"-v", "nexus-dev-data:/data/nexus-dev",
"-e", "OPENAI_API_KEY",
"nexus-dev:latest"
]
}
}
}
Multi-Project Setup:
For multiple projects, you have two options:
1. Mount a parent directory containing all projects:
   "-v", "/Users/you/Projects:/workspace:ro"
   Then index paths like `/workspace/project-a/src/` and `/workspace/project-b/src/`. Each project needs its own `nexus_config.json` with a unique `project_id`.
2. Use a local Python install (recommended): MCP clients automatically set the working directory to the project root, so no path configuration is needed.
Running Tests
make test # Run all tests
make test-cov # Run with coverage report
pytest tests/unit/ -v # Run specific test directory
Adding Language Support
See CONTRIBUTING.md for instructions on adding new language chunkers.
License
MIT License - see LICENSE for details.