MemoryMaster
Production-grade memory reliability system for AI coding agents.
Lifecycle-managed claims with citations, conflict detection, steward governance, hybrid retrieval, and MCP integration. Give your AI agents persistent, trustworthy memory.
MemoryMaster prevents the #1 problem with agent memory: drift, stale assumptions, and unsafe disclosure. It gives Claude Code, Codex, and any MCP-compatible agent persistent, verifiable memory with a full claim lifecycle, citation tracking, conflict detection, and human-in-the-loop governance.
Architecture
┌─────────────────────────────────────────────────────────────────┐
│ Agent Runtime │
│ (Claude Code / Codex / any MCP-compatible agent) │
└────────────┬────────────────────────────────┬───────────────────┘
│ MCP (22 tools) │ CLI (64 commands)
v v
┌─────────────────────────────────────────────────────────────────┐
│ MemoryMaster Core │
│ ┌──────────┐ ┌───────────┐ ┌──────────┐ ┌───────────────┐ │
│ │ Ingestor │ │ Extractor │ │ Validator │ │ State Engine │ │
│ │ (events) │->│ (claims) │->│ (probes) │->│ (6-state FSM) │ │
│ └──────────┘ └───────────┘ └──────────┘ └───────────────┘ │
│ ┌──────────┐ ┌───────────┐ ┌──────────┐ ┌───────────────┐ │
│ │ Retrieval│ │ Compactor │ │ Steward │ │ Dashboard │ │
│ │ (hybrid) │ │ (archive) │ │ (govern) │ │ (HTML+SSE) │ │
│ └──────────┘ └───────────┘ └──────────┘ └───────────────┘ │
└────────┬──────────────┬──────────────┬──────────┬───────────────┘
v v v v
  SQLite/Postgres       Qdrant        Ollama/CLI        Claude Code
                       (vectors)     (LLM stack)    (Auto Dream + Vault)
Key features
- 6-state lifecycle: `candidate → confirmed → stale → superseded → conflicted → archived`
- Citation tracking with provenance for every claim
- Hybrid retrieval: vector (sentence-transformers / Gemini) + FTS5 + freshness + confidence
- Context optimizer: `query_for_context(budget=4000)` returns auto-curated memory that fits your token budget
- Entity graph with typed relationships and alias resolution
- Steward governance: multi-probe validators (filesystem, format, citation, semantic, tool) with proposal review
- Conflict resolution: 5-tier auto (confidence > freshness > citations > LLM > manual)
- Auto-redaction at ingest: JWT, GitHub tokens, Bearer, AWS keys, SSH keys, custom patterns
- LLM Wiki: compiled-truth + append-only timeline articles with progressive-disclosure frontmatter
- Dual backend: SQLite (zero-config) and Postgres (full feature parity with pgvector)
- Dream Bridge for bidirectional sync with Claude Code's Auto Dream
- 7-hook stack: recall, classify, validate-wiki, session-start, auto-ingest, precompact, steward-cron
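The five resolution tiers (confidence > freshness > citations > LLM > manual) can be pictured as a chain of comparators. This is an illustrative sketch only, not MemoryMaster's shipped implementation; the `Claim` fields and the 0.1 confidence margin are assumptions:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Callable, Optional

@dataclass
class Claim:
    text: str
    confidence: float      # 0.0-1.0 validator score
    updated_at: datetime   # freshness signal
    citations: int         # number of provenance citations

def resolve_conflict(a: Claim, b: Claim,
                     llm_judge: Optional[Callable] = None) -> Optional[Claim]:
    """Pick a winner between two conflicting claims, tier by tier.
    Returns None when only a human steward can decide."""
    # Tier 1: confidence — a clear margin wins outright.
    if abs(a.confidence - b.confidence) > 0.1:
        return a if a.confidence > b.confidence else b
    # Tier 2: freshness — the more recently updated claim wins.
    if a.updated_at != b.updated_at:
        return a if a.updated_at > b.updated_at else b
    # Tier 3: citations — the better-sourced claim wins.
    if a.citations != b.citations:
        return a if a.citations > b.citations else b
    # Tier 4: LLM adjudication, if a judge callable is configured.
    if llm_judge is not None:
        return llm_judge(a, b)
    # Tier 5: fall through to manual steward review.
    return None
```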
Full feature index lives in docs/handbook.md.
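Ingest-time auto-redaction of the secret types listed above amounts to a pass of substitution patterns over the claim text. The regexes below are illustrative approximations, not the exact patterns MemoryMaster ships (its list is configurable):

```python
import re

# Illustrative secret patterns; real deployments should tune these.
SECRET_PATTERNS = [
    # JWTs: three base64url segments, header always starts with "eyJ"
    (re.compile(r"eyJ[A-Za-z0-9_-]+\.[A-Za-z0-9_-]+\.[A-Za-z0-9_-]+"),
     "[REDACTED_JWT]"),
    # GitHub tokens: ghp_/gho_/ghu_/ghs_/ghr_ prefix + 36+ alphanumerics
    (re.compile(r"gh[pousr]_[A-Za-z0-9]{36,}"), "[REDACTED_GITHUB_TOKEN]"),
    # Bearer authorization headers
    (re.compile(r"Bearer\s+[A-Za-z0-9._~+/-]+=*"), "Bearer [REDACTED]"),
    # AWS access key IDs
    (re.compile(r"AKIA[0-9A-Z]{16}"), "[REDACTED_AWS_KEY]"),
    # PEM-encoded private keys (SSH, RSA)
    (re.compile(r"-----BEGIN (?:RSA |OPENSSH )?PRIVATE KEY-----"
                r"[\s\S]+?-----END (?:RSA |OPENSSH )?PRIVATE KEY-----"),
     "[REDACTED_SSH_KEY]"),
]

def redact(text: str) -> str:
    """Replace anything that looks like a credential before it is stored."""
    for pattern, replacement in SECRET_PATTERNS:
        text = pattern.sub(replacement, text)
    return text
```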
Prerequisites
Required
- Python 3.10+ with `pip`
- Claude Code, Codex, or any MCP-compatible agent
Optional
- Already a Claude Code subscriber? No API key needed: set `MEMORYMASTER_LLM_PROVIDER=claude_cli` and the steward and auto-ingest hooks will use your existing OAuth session via the local `claude --print` binary
- A free Gemini API key from aistudio.google.com, which powers the auto-ingest hook at near-zero cost
- Node.js 18+ for graphify and GitNexus
- Obsidian 1.6+ with the Bases core plugin (for visual wiki browsing)
- Docker for Qdrant (SQLite FTS5 is the default and works out of the box)
Quick start
pip install "memorymaster[mcp]"
memorymaster --db memorymaster.db init-db
memorymaster-setup # interactive: hooks, MCP, steward cron, CLAUDE.md / AGENTS.md
That's enough to use the CLI, the MCP server, and the auto-ingest Stop hook.
# Ingest a claim with citation
memorymaster --db memorymaster.db ingest \
--text "Server uses PostgreSQL 16" \
--source "session://chat|turn-3|user confirmed"
# Query memory (hybrid retrieval)
memorymaster --db memorymaster.db query "database version" --retrieval-mode hybrid
# Context optimizer — the killer feature for agents
memorymaster --db memorymaster.db context "auth patterns" --budget 4000 --format xml
# Run validation cycle
memorymaster --db memorymaster.db run-cycle
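Under the hood, `context` with a `--budget` has to pack the highest-value claims into a fixed token allowance. A greedy sketch of that idea, where the per-claim `score` and the 4-characters-per-token estimate are assumptions for illustration, not the shipped algorithm:

```python
def pack_context(claims, budget_tokens=4000):
    """Greedily select claims by score until the token budget is spent.

    Each claim is a dict with 'text' and 'score' (relevance blended with
    freshness and confidence, computed upstream). Token cost is estimated
    at roughly 4 characters per token, a common rough heuristic.
    """
    selected, used = [], 0
    for claim in sorted(claims, key=lambda c: c["score"], reverse=True):
        cost = max(1, len(claim["text"]) // 4)
        if used + cost > budget_tokens:
            continue  # skip claims that would overflow the budget
        selected.append(claim)
        used += cost
    return selected, used
```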
For the one-prompt agent install (paste into any agent with shell access), see docs/handbook.md#one-prompt-agent-install.
Pick your LLM provider
| Provider | Env vars | Default model | Cost |
|---|---|---|---|
| Claude Code OAuth (recommended for subscribers) | `MEMORYMASTER_LLM_PROVIDER=claude_cli` (requires `claude` CLI on PATH) | `claude-haiku-4-5-20251001` | included in Claude Code plan |
| Google Gemini (default) | `MEMORYMASTER_LLM_PROVIDER=google` + `GEMINI_API_KEY=...` | `gemini-3.1-flash-lite-preview` | ~free |
| OpenAI | `MEMORYMASTER_LLM_PROVIDER=openai` + `OPENAI_API_KEY=...` | `gpt-4o-mini` | ~$0.001/call |
| Anthropic API | `MEMORYMASTER_LLM_PROVIDER=anthropic` + `ANTHROPIC_API_KEY=...` | `claude-haiku-4-5-20251001` | ~$0.001/call |
| Ollama (local) | `MEMORYMASTER_LLM_PROVIDER=ollama` + `OLLAMA_URL=http://localhost:11434` | `llama3.2:3b` | free |
The claude_cli provider shells out to your local `claude --print` binary, so it inherits the OAuth session you're already logged into in Claude Code: no API key, no rotator, no quota juggling. Caveat: cold start adds 3-15 s per call (subprocess spawn), so it is ideal for batched/cron paths (steward, wiki-absorb) but not for latency-sensitive recall. Override the binary and timeout with `MEMORYMASTER_CLAUDE_CLI_BIN` and `MEMORYMASTER_CLAUDE_CLI_TIMEOUT`. On VM installs the OAuth token expires after roughly 24 hours, so pair it with `MEMORYMASTER_LLM_FALLBACK_PROVIDER=ollama`; desktop tokens don't expire.
For zero-cost offline use, install Ollama, run `ollama pull llama3.2:3b`, and set `MEMORYMASTER_LLM_PROVIDER=ollama`.
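The primary-plus-fallback pattern (claude_cli first, ollama when the OAuth token has expired) boils down to trying providers in order until one succeeds. A generic sketch with stub providers; the callable interface here is an assumption, not MemoryMaster's actual provider API:

```python
def call_with_fallback(providers, prompt):
    """Try each provider in order; return (name, response) from the first
    one that succeeds.

    `providers` is a list of (name, callable) pairs. A callable raises on
    failure (expired OAuth token, unreachable server, timeout), which
    triggers fallback to the next provider in the list.
    """
    errors = []
    for name, provider in providers:
        try:
            return name, provider(prompt)
        except Exception as exc:  # any failure falls through to the next
            errors.append(f"{name}: {exc}")
    raise RuntimeError("all providers failed: " + "; ".join(errors))
```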
MCP server
{
"mcpServers": {
"memorymaster": {
"command": "memorymaster-mcp",
"env": {
"MEMORYMASTER_DEFAULT_DB": "/path/to/memorymaster.db",
"MEMORYMASTER_WORKSPACE": "/path/to/your/project"
}
}
}
}
22 MCP tools: init_db, ingest_claim, run_cycle, run_steward, classify_query, query_memory, query_for_context, list_claims, redact_claim_payload, pin_claim, compact_memory, list_events, search_verbatim, open_dashboard, list_steward_proposals, resolve_steward_proposal, extract_entities, entity_stats, find_related_claims, quality_scores, recompute_tiers, federated_query.
See .mcp.json.example for the full template.
Backends
| Backend | Install | Use case |
|---|---|---|
| SQLite | Built-in | Local development, single-agent, zero-config |
| Postgres | `pip install "memorymaster[postgres]"` | Team deployment, multi-agent, pgvector search |
Docker Compose
Run the full stack (MemoryMaster + Qdrant + Ollama) with one command:
docker compose up -d
See INSTALLATION.md for Kubernetes / Helm.
Development
# Install with dev dependencies
pip install -e ".[dev,mcp,security,embeddings,qdrant]"
# Run tests
pytest tests/ -q
# Lint and format
ruff check memorymaster/ && ruff format memorymaster/
# Performance benchmarks
python benchmarks/perf_smoke.py
See CONTRIBUTING.md for the full workflow.
Documentation
| Document | Description |
|---|---|
| docs/handbook.md | Full operator handbook — hooks, dashboard, steward, dream bridge, troubleshooting, one-prompt install |
| INSTALLATION.md | Setup guide: pip, Docker, Helm, MCP config |
| CONTRIBUTING.md | Dev setup, testing, PR workflow |
| ARCHITECTURE.md | System design and subsystem details |
| USER_GUIDE.md | Usage, MCP integration, troubleshooting |
| CHANGELOG.md | Version history and release notes |
| ROADMAP.md | Release plan and future tracks |
| docs/enabling-v2-systems.md | v3 statistical classifier + cadence policy opt-in |
License
File details
Details for the file memorymaster-3.12.0.tar.gz.
File metadata
- Download URL: memorymaster-3.12.0.tar.gz
- Upload date:
- Size: 720.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `321e83e8170ee28c76a72f1e9fbb58a897af09b9ba386e7062e472c3faab97f6` |
| MD5 | `ce90e1b4445d2b3e06e71c294dcfc33b` |
| BLAKE2b-256 | `31f41be1a65ea8cac4208ee2ce79d2036409a25ccece031f3ba46df33a30017d` |
Provenance
The following attestation bundles were made for memorymaster-3.12.0.tar.gz:
Publisher: publish.yml on wolverin0/memorymaster
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: memorymaster-3.12.0.tar.gz
- Subject digest: 321e83e8170ee28c76a72f1e9fbb58a897af09b9ba386e7062e472c3faab97f6
- Sigstore transparency entry: 1395401244
- Sigstore integration time:
- Permalink: wolverin0/memorymaster@e75fea820b7d6b6305d3def73fc5c22e50c7018c
- Branch / Tag: refs/tags/v3.12.0
- Owner: https://github.com/wolverin0
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@e75fea820b7d6b6305d3def73fc5c22e50c7018c
- Trigger Event: push
File details
Details for the file memorymaster-3.12.0-py3-none-any.whl.
File metadata
- Download URL: memorymaster-3.12.0-py3-none-any.whl
- Upload date:
- Size: 864.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `6b07b190aa3bfc2756bee70e4a6bc755d0d8d11b7eaa49ce5bbf0555c464cef3` |
| MD5 | `a95f341e7bf55ff34c89b09edbf49e17` |
| BLAKE2b-256 | `d9cde53c31dbc11c45644fbd0bdea40e7971b2ddd3691be1d6fe866b98f3e274` |
Provenance
The following attestation bundles were made for memorymaster-3.12.0-py3-none-any.whl:
Publisher: publish.yml on wolverin0/memorymaster
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: memorymaster-3.12.0-py3-none-any.whl
- Subject digest: 6b07b190aa3bfc2756bee70e4a6bc755d0d8d11b7eaa49ce5bbf0555c464cef3
- Sigstore transparency entry: 1395401311
- Sigstore integration time:
- Permalink: wolverin0/memorymaster@e75fea820b7d6b6305d3def73fc5c22e50c7018c
- Branch / Tag: refs/tags/v3.12.0
- Owner: https://github.com/wolverin0
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@e75fea820b7d6b6305d3def73fc5c22e50c7018c
- Trigger Event: push