Contextual
Temporal-first local code memory for AI tools via MCP.
Contextual is a local-first, bi-temporal, code-aware semantic context engine that gives AI coding assistants persistent, accurate memory of your codebase — across tools, across time, without cloud dependencies.
The Problem
Every AI coding tool today suffers from the same fundamental flaw: context amnesia. Claude Code forgets what you discussed yesterday. Cursor re-indexes from scratch. Copilot has no memory of architectural decisions. Your AI assistant hallucinates function signatures that changed three commits ago because it has no concept of when things were true.
The result: wasted tokens, hallucinated code, repeated explanations, and lost architectural intent.
The Solution
Contextual maintains a bi-temporal knowledge graph of your codebase — tracking not just what is true, but when it became true and when you learned it was true. It exposes this knowledge through the Model Context Protocol (MCP), making it available to every AI tool in your stack simultaneously.
```
Developer writes code → Contextual indexes changes (tree-sitter + git blame)
                      → Builds temporal knowledge graph (SQLite bi-temporal)
                      → Embeds semantically (jina-v2-base-code + BM25)
                      → Serves context via MCP to any AI tool
```
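The bi-temporal idea behind this pipeline splits each fact across two time axes: valid time (when it was true in the code) and transaction time (when the index believed it). A minimal sketch, using an illustrative SQLite layout (the table and column layout is an assumption, not Contextual's actual schema):

```python
import sqlite3

# Illustrative bi-temporal fact store. The four timestamp columns mirror the
# ones Contextual describes, but this layout is a sketch, not its real schema.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE facts (
        entity     TEXT,  -- e.g. a function's qualified name
        value      TEXT,  -- e.g. its signature
        valid_at   TEXT,  -- valid time: when this became true in the code
        invalid_at TEXT,  -- valid time: when it stopped being true (NULL = still true)
        created_at TEXT,  -- transaction time: when the index learned it
        expired_at TEXT   -- transaction time: when this row was superseded (NULL = current)
    )
""")

# History of one signature change: the original open-ended belief is expired,
# replaced by a closed-out version plus the new current signature.
conn.executemany("INSERT INTO facts VALUES (?, ?, ?, ?, ?, ?)", [
    ("UserService.get", "get(id: int)",  "2026-04-01", None,         "2026-04-01", "2026-05-10"),
    ("UserService.get", "get(id: int)",  "2026-04-01", "2026-05-10", "2026-05-10", None),
    ("UserService.get", "get(id: UUID)", "2026-05-10", None,         "2026-05-10", None),
])

def as_of(entity, t):
    """What was true at valid time t, according to current knowledge?"""
    row = conn.execute(
        """SELECT value FROM facts
           WHERE entity = ? AND valid_at <= ?
             AND (invalid_at IS NULL OR invalid_at > ?)
             AND expired_at IS NULL""",
        (entity, t, t),
    ).fetchone()
    return row[0] if row else None

print(as_of("UserService.get", "2026-05-01"))  # get(id: int)
print(as_of("UserService.get", "2026-06-01"))  # get(id: UUID)
```

This is what makes "what was the API signature last Tuesday?" answerable: the old signature is never overwritten, only closed out along the valid-time axis.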
Key Features
- Temporal-First Memory — Every fact carries four timestamps (valid_at, invalid_at, created_at, expired_at). Ask "what was the API signature last Tuesday?" and get the right answer.
- Tool-Agnostic via MCP — Works with Claude Desktop, Claude Code, Cursor, VS Code Copilot, Gemini CLI, Windsurf, Zed, and any MCP-compatible client. One index, every tool.
- Local-First, Private by Design — All data stays on your machine. No cloud, no telemetry, no API keys required. Runs on 8GB RAM laptops.
- Hybrid Code Search — Combines dense semantic embeddings (jina-v2-base-code) with code-aware BM25 (camelCase/snake_case splitting) via Reciprocal Rank Fusion, reranked by a cross-encoder.
- Git-Native Indexing — Incremental indexing via post-commit hooks. Blame-based temporal attribution. Force-push and rebase aware.
- 7 Languages at Launch — Python, TypeScript, JavaScript, Go, Java, Rust, and C#, plus config formats (JSON, YAML, TOML, Dockerfile, Markdown).
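The hybrid-search feature can be illustrated with a small sketch of Reciprocal Rank Fusion plus the camelCase/snake_case splitting the BM25 side relies on. The `k = 60` constant is the value from the original RRF formulation; whether Contextual uses that constant, and the file names below, are assumptions for illustration:

```python
import re

def split_identifier(name):
    """Code-aware tokenization: split snake_case and camelCase identifiers
    into lowercase word tokens for keyword (BM25) matching."""
    tokens = []
    for part in re.split(r"[_\W]+", name):
        # Break camelCase / PascalCase boundaries, e.g. "getUserById" -> get, User, By, Id
        tokens += re.findall(r"[A-Z]+(?=[A-Z][a-z])|[A-Z]?[a-z]+|[A-Z]+|\d+", part)
    return [t.lower() for t in tokens if t]

def rrf(rankings, k=60):
    """Reciprocal Rank Fusion: score(d) = sum over rankings of 1 / (k + rank)."""
    scores = {}
    for ranking in rankings:
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

dense  = ["auth.py::verify_token", "auth.py::login", "db.py::connect"]   # semantic ranking
sparse = ["auth.py::verify_token", "db.py::connect", "http.py::route"]   # BM25 ranking
print(rrf([dense, sparse]))  # documents ranked high by both lists fuse to the top
```

In the full system the fused list would then be reranked by the cross-encoder; RRF only needs ranks, not comparable scores, which is why it works across a dense and a sparse retriever.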
Quick Start
```shell
# Install
uvx contextual

# Index your codebase
contextual index .

# Search
contextual search "authentication middleware"

# Temporal recall
contextual recall "UserService" --as-of "2026-05-01"

# Start MCP server for your IDE
contextual serve --stdio
```
IDE Integration
Run `contextual setup <ide>` to auto-configure any supported IDE, or configure manually:
Claude Desktop / Cursor / Windsurf
```json
{
  "mcpServers": {
    "contextual": {
      "command": "uvx",
      "args": ["contextual", "serve", "--stdio"]
    }
  }
}
```
VS Code Copilot
```json
{
  "mcp.servers": {
    "contextual": {
      "type": "stdio",
      "command": "uvx",
      "args": ["contextual", "serve", "--stdio"]
    }
  }
}
```
Claude Code
```shell
claude mcp add contextual -- uvx contextual serve --stdio
```
Gemini CLI
```json
{
  "mcpServers": {
    "contextual": {
      "command": "uvx",
      "args": ["contextual", "serve", "--stdio"]
    }
  }
}
```
MCP Tools
| Tool | Description |
|---|---|
| `index` | Index a codebase at a given path |
| `search` | Hybrid semantic + keyword code search |
| `recall` | Temporal recall — what did we know about X at time T? |
| `capture_decision` | Record an architectural decision |
| `freshness` | Check how stale the current index is |
| `status` | System health and statistics |
| `forget` | Invalidate a fact (never deletes — marks as expired) |
| `timeline` | Full temporal history of any entity |
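The forget and timeline semantics in the table pair naturally: invalidation is an UPDATE that stamps an expiry timestamp, never a DELETE, so the full history stays queryable. A minimal sketch under that assumption (the table and fact below are hypothetical, not Contextual's storage code):

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE facts (entity TEXT, value TEXT, created_at TEXT, expired_at TEXT)"
)
conn.execute(
    "INSERT INTO facts VALUES ('AuthService', 'uses JWT sessions', '2026-04-01', NULL)"
)

def forget(entity):
    """Soft-invalidate: mark current facts as expired instead of deleting them."""
    now = datetime.now(timezone.utc).isoformat()
    conn.execute(
        "UPDATE facts SET expired_at = ? WHERE entity = ? AND expired_at IS NULL",
        (now, entity),
    )

def timeline(entity):
    """Full temporal history: expired rows are still present and ordered."""
    return conn.execute(
        "SELECT value, created_at, expired_at FROM facts WHERE entity = ? ORDER BY created_at",
        (entity,),
    ).fetchall()

forget("AuthService")
rows = timeline("AuthService")
print(rows)  # the row survives, with expired_at now set rather than the row deleted
```

Keeping expired rows is what allows `timeline` to answer "what did we believe, and when did we stop believing it" without a separate audit log.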
Requirements
- Python 3.12+
- macOS, Linux, or Windows
- 8GB RAM minimum (16GB recommended)
- Git repository (full history recommended)
Documentation
- Architecture — System design and data flow
- Technical Specification — Complete implementation reference
- Roadmap — 30-day build plan
- ADRs — Architectural Decision Records
- Delegation Playbook — AI team task guide
License
Business Source License 1.1 — Change License: Apache 2.0, Change Date: 4 years from each release.
Acknowledgements
Built with FastMCP, LanceDB, tree-sitter, pygit2, fastembed, and tantivy-py. Temporal model inspired by Graphiti/Zep and Snodgrass's bi-temporal formalism.
File details
Details for the file contextual_engine-0.1.0.tar.gz.
File metadata
- Download URL: contextual_engine-0.1.0.tar.gz
- Upload date:
- Size: 508.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `9d88f2da5bc01888f6c5bb8accc427a91afa7c327ddcfd7ee8ea175a6c4dbb41` |
| MD5 | `733108a0b49932cc0381836b587f3172` |
| BLAKE2b-256 | `24e92262c7c747f166d2a259756f8ae2d5607a181f824e7cc18b23a3c77444b2` |
File details
Details for the file contextual_engine-0.1.0-py3-none-any.whl.
File metadata
- Download URL: contextual_engine-0.1.0-py3-none-any.whl
- Upload date:
- Size: 149.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `f8dc75d12b79e29402ffcc897866452101b59380f6cbd5bffad48b6c42ba4a75` |
| MD5 | `0be6f4ac5aec82d6dfbcf669cb5844b0` |
| BLAKE2b-256 | `4466c7d978c8ddea27f00ec198567e6d12eb180e6b644e9e33fa1a55d70e6b76` |