🧠 Mnemosynth
Persistent, hallucination-resistant memory for any LLM, agent, or AI workflow.
```shell
pip install mnemosynth           # fast, lightweight install (keyword-based engines)
pip install "mnemosynth[ml]"     # adds PyTorch/Transformers for neural models
```
PyPI · Apache 2.0 · Python ≥ 3.10 · No external services required
The Problem
LLMs forget everything when a session ends. RAG helps — but raw retrieval doesn't verify, decay, or reason about what it stores. You get confident hallucinations, stale facts, and no audit trail.
Mnemosynth is a cognitive memory OS, not a vector store.
Three-Tier Memory Model
Inspired by how the brain actually works:
| Tier | Brain Region | Backend | What It Stores |
|---|---|---|---|
| Episodic | Hippocampus | LanceDB (vector) | Events, conversations, timestamped history |
| Semantic | Neocortex | NetworkX (graph) | Verified facts, entity relationships |
| Procedural | Cerebellum | JSON registry | Tools, schemas, workflows |
Memories are automatically classified on write. No manual tagging required.
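The routing step can be pictured as a small classifier over the incoming text. This is a hypothetical keyword-based sketch to illustrate the idea, not Mnemosynth's actual router (which can use neural models when `[ml]` is installed); the cue lists are invented for the example:

```python
# Hypothetical sketch of tier routing; cue lists are illustrative only.
PROCEDURAL_CUES = ("deploy with", "run:", "workflow", "docker", "command")
EPISODIC_CUES = ("yesterday", "today", "last week", "we finalized", "meeting")

def classify(text: str) -> str:
    """Route a memory to episodic, semantic, or procedural storage."""
    lowered = text.lower()
    if any(cue in lowered for cue in PROCEDURAL_CUES):
        return "procedural"   # tools, schemas, how-to commands
    if any(cue in lowered for cue in EPISODIC_CUES):
        return "episodic"     # timestamped events and conversations
    return "semantic"         # default: durable facts about entities

print(classify("Deploy with: docker compose up -d"))       # procedural
print(classify("Yesterday we finalized the auth module"))  # episodic
print(classify("User prefers Python and dark mode"))       # semantic
```

The three example sentences are the same ones used in the quickstart below, so you can see how each lands in a different tier.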
Quickstart
Python API
```python
from mnemosynth import Mnemosynth

brain = Mnemosynth()

# Store — auto-classified into episodic/semantic/procedural
brain.remember("User prefers Python and dark mode")
brain.remember("Yesterday we finalized the auth module")
brain.remember("Deploy with: docker compose up -d")

# Retrieve with semantic search
for r in brain.recall("What languages does the user prefer?"):
    print(f"[{r.memory_type.value}] {r.content} ({r.confidence:.0%})")

# Compress relevant memories into a <150-token LLM context block
digest = brain.digest("Starting a new project")

# Dream: cluster episodic patterns -> promote to verified semantic facts
brain.dream()
```
MCP Server (Claude Desktop / Cursor / Windsurf)
```json
{
  "mcpServers": {
    "mnemosynth": {
      "command": "mnemosynth",
      "args": ["serve"]
    }
  }
}
```
Everything runs locally under ~/.mnemosynth/. No external services.
Anti-Hallucination Engine
Mnemosynth doesn't just store — it defends:
- Forgetting curve — Ebbinghaus decay with configurable half-life; stale memories lose confidence
- Contradiction detection — Flags conflicting facts (uses a DeBERTa NLI model if `[ml]` is installed, otherwise falls back to fast keyword analysis)
- Sentiment analysis — Emotion-aware memory retrieval (uses DistilBERT if `[ml]` is installed, otherwise falls back to fast keyword scoring)
- Belief revision chains — Deprecated facts link forward to their replacements
- Corroboration scoring — Repeated observations boost confidence
- Memory immune system — Prompt injection detection, rate limiting, quarantine
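The forgetting curve can be sketched as exponential half-life decay. This is a minimal illustration, assuming the `half_life_days` and `min_confidence` defaults shown in the Configuration section; the real decay logic may differ:

```python
def decayed_confidence(initial: float, age_days: float,
                       half_life_days: float = 30.0,
                       min_confidence: float = 0.1) -> float:
    """Ebbinghaus-style decay: confidence halves every half-life."""
    decayed = initial * 0.5 ** (age_days / half_life_days)
    return max(decayed, min_confidence)  # clamp to a floor instead of deleting

print(decayed_confidence(1.0, 0))    # 1.0 (fresh memory)
print(decayed_confidence(1.0, 30))   # 0.5 (one half-life old)
print(decayed_confidence(1.0, 120))  # 0.1 (clamped to the floor)
```

Corroboration works against this decay: each repeated observation can reset or boost the confidence before the next decay step.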
Dream Consolidation
Run `brain.dream()` (or `mnemosynth dream`) to trigger offline consolidation:
- Cluster episodic memories with HDBSCAN
- Promote recurring patterns to verified semantic facts
- Decay memories that haven't been accessed recently
Think of it as sleep for your AI's memory.
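The promotion rule can be sketched in a few lines. In the real pipeline the clusters come from HDBSCAN over embeddings; this stand-in groups on a hypothetical `topic` label purely to show how recurring patterns graduate once they reach `min_cluster_size`:

```python
from collections import defaultdict

def dream(episodic: list[dict], min_cluster_size: int = 3) -> list[str]:
    """Promote recurring episodic patterns to semantic facts (sketch).

    Stand-in: groups on an invented 'topic' key instead of clustering
    embeddings with HDBSCAN, to illustrate the promotion threshold.
    """
    clusters = defaultdict(list)
    for memory in episodic:
        clusters[memory["topic"]].append(memory)
    # A pattern seen min_cluster_size or more times becomes a verified fact.
    return [topic for topic, members in clusters.items()
            if len(members) >= min_cluster_size]

episodes = [{"topic": "user prefers dark mode"}] * 3 + [{"topic": "one-off chat"}]
print(dream(episodes))  # ['user prefers dark mode']
```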
MCP Tools
| Tool | Description |
|---|---|
| `add_memory` | Store with auto-classification |
| `search_memory` | Semantic search across all tiers |
| `get_digest` | Compressed XML context block for LLM injection |
| `get_contradictions` | Surface conflicting facts |
| `run_dream` | Trigger consolidation |
| `forget` | Delete by ID |
| `get_stats` | Memory statistics |
| `get_provenance` | Full audit trail for any memory |
CLI
```shell
mnemosynth serve                 # Start MCP server
mnemosynth stats                 # Memory dashboard
mnemosynth search "query"        # Semantic search
mnemosynth inspect               # Browse memory tree
mnemosynth dream                 # Run consolidation
mnemosynth health                # System diagnostics
mnemosynth export -o out.json    # Export to JSON or Markdown
mnemosynth forget <ID>           # Delete a memory
mnemosynth reset --confirm       # Wipe everything
```
Configuration
Override defaults in `~/.mnemosynth/config.yaml`:

```yaml
embedding_model: all-MiniLM-L6-v2
max_episodic_memories: 10000
max_semantic_nodes: 5000

decay:
  half_life_days: 30.0
  min_confidence: 0.1

dream:
  interval_hours: 24
  min_cluster_size: 3

digest:
  max_tokens: 150
  top_k: 5
```
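Keys you omit presumably fall back to the built-in defaults. A sketch of that kind of recursive overlay, under the assumption that nested sections merge key-by-key (the actual loader may behave differently):

```python
def merge_config(defaults: dict, overrides: dict) -> dict:
    """Recursively overlay user overrides onto default config values."""
    merged = dict(defaults)
    for key, value in overrides.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge_config(merged[key], value)  # merge subsections
        else:
            merged[key] = value  # scalar or new key: replace outright
    return merged

# Illustrative subset of the defaults shown above.
DEFAULTS = {
    "embedding_model": "all-MiniLM-L6-v2",
    "decay": {"half_life_days": 30.0, "min_confidence": 0.1},
}
user = {"decay": {"half_life_days": 14.0}}  # override one nested key only
print(merge_config(DEFAULTS, user))
# decay.half_life_days becomes 14.0; min_confidence keeps its default 0.1
```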
Architecture
```
Claude Desktop / Cursor / Windsurf / Python API
        |  MCP (stdio) / Python import
Router · Digest · Decay · Dream
Sentiment · Contradiction · Immune System
        |
Episodic (LanceDB) · Semantic (NetworkX) · Procedural (JSON)
        |
~/.mnemosynth/
```
Built by Vasudev Jaiswal · Apache 2.0