# 🧠 Mnemosynth

**Cognitive Memory OS for AI**

Persistent, verified, hallucination-resistant memory for any LLM, agent, or AI workflow.
## The Problem

LLMs forget everything when a session ends. RAG helps, but raw retrieval doesn't verify, decay, or reason about what it stores. You get confident hallucinations, stale facts, and no audit trail.

Mnemosynth is a cognitive memory OS, not a vector store.
## Install

```bash
pip install mnemosynth           # Lightweight (keyword engines)
pip install "mnemosynth[ml]"     # Advanced (PyTorch + Transformers)
```

Zero-config. No Docker. No external databases. No API keys. Everything runs locally under `~/.mnemosynth/`.
## Three-Tier Memory Model
Inspired by how the brain actually works:
| Tier | Brain Region | Backend | What It Stores |
|---|---|---|---|
| Episodic 🔵 | Hippocampus | LanceDB (vector) | Events, conversations, timestamped history |
| Semantic 🟢 | Neocortex | NetworkX (graph) | Verified facts, entity relationships |
| Procedural 🟠 | Cerebellum | JSON registry | Tools, schemas, workflows |
Memories are auto-classified on write. No manual tagging required.
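To make the auto-classification concrete, here is a minimal keyword-based sketch of how a write might be routed to a tier. This is illustrative only: Mnemosynth's actual classifier is internal, and the marker lists and `classify` function here are assumptions for demonstration.

```python
# Illustrative sketch of tier routing on write. The markers below are
# assumptions, not Mnemosynth's real classification rules.
def classify(text: str) -> str:
    lowered = text.lower()
    # Procedural: imperative "how-to" content (commands, workflows)
    if any(m in lowered for m in ("deploy with", "command", "workflow")):
        return "procedural"
    # Episodic: time-anchored events and conversation history
    if any(m in lowered for m in ("yesterday", "today", "last week")):
        return "episodic"
    # Semantic: everything else is treated as a standing fact
    return "semantic"

print(classify("Deploy with: docker compose up -d"))      # procedural
print(classify("Yesterday we finalized the auth module")) # episodic
print(classify("User prefers Python and dark mode"))      # semantic
```

The real engine uses the episodic/semantic/procedural split shown in the table above; only the heuristic itself is simplified here.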
## Quickstart

### Python API

```python
from mnemosynth import Mnemosynth

brain = Mnemosynth()

# Store: auto-classified into episodic/semantic/procedural
brain.remember("User prefers Python and dark mode")
brain.remember("Yesterday we finalized the auth module")
brain.remember("Deploy with: docker compose up -d")

# Retrieve with semantic search
for r in brain.recall("What languages does the user prefer?"):
    print(f"[{r.memory_type.value}] {r.content} ({r.confidence:.0%})")

# Compressed context block for LLM injection
digest = brain.digest("Starting a new project")

# Dream: cluster episodic memories, promote to verified semantic facts
brain.dream()
```
### MCP Server (Claude Desktop / Cursor / Windsurf)

```json
{
  "mcpServers": {
    "mnemosynth": {
      "command": "mnemosynth",
      "args": ["serve"]
    }
  }
}
```
## Anti-Hallucination Engine
| Feature | Description |
|---|---|
| 🧊 Ebbinghaus Decay | Stale memories lose confidence over time (configurable half-life) |
| ⚔️ Contradiction Detection | Flags conflicting facts via DeBERTa NLI or keyword antonyms |
| 💜 Sentiment Scoring | DistilBERT or keyword-based emotional valence scoring |
| 🛡️ Immune System | Blocks prompt injections, rate limits, quarantines threats |
| 📊 Corroboration | Repeated observations boost confidence |
| 🔗 Belief Revision | Deprecated facts link forward to their replacements |
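The Ebbinghaus decay row above can be sketched as standard exponential half-life decay: a memory's confidence halves every `half_life_days` days. The function name and signature below are illustrative, not Mnemosynth's internal API; only the formula is standard.

```python
# Sketch of Ebbinghaus-style confidence decay with a configurable
# half-life. Illustrative only; not Mnemosynth's actual implementation.
def decayed_confidence(initial: float, age_days: float,
                       half_life_days: float = 30.0) -> float:
    """Confidence halves every `half_life_days` days."""
    return initial * 0.5 ** (age_days / half_life_days)

print(decayed_confidence(1.0, 30))  # 0.5 after one half-life
print(decayed_confidence(1.0, 90))  # 0.125 after three half-lives
```

Corroboration works against this curve: repeated observations push confidence back up, so only facts that are neither re-observed nor verified fade away.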
## Agentic Framework Adapters

First-class integrations for the major agent frameworks:

```python
# CrewAI
from mnemosynth.adapters.crewai import get_crewai_tools
tools = get_crewai_tools(brain)

# LangChain / LangGraph
from mnemosynth.adapters.langchain import get_mnemosynth_tools, MnemosynthMemory

# AutoGen
from mnemosynth.adapters.autogen import get_autogen_tools

# Cross-Agent Memory Bus
from mnemosynth.adapters.memory_bus import MemoryBus
bus = MemoryBus()
bus.remember("shared fact", namespace="researcher")
```

Install adapter deps: `pip install "mnemosynth[crewai]"`, `pip install "mnemosynth[langchain]"`, etc.
## Causal Memory Chains

Track Decision → Reason → Outcome DAGs:

```python
from mnemosynth.engine.causal import CausalChainEngine

engine = CausalChainEngine(brain)
engine.record_chain(
    decision="Switched from PostgreSQL to SQLite",
    reasons=["Need zero-config deployment", "Single-user workload"],
    outcome="Reduced setup time from 30min to 0",
)

# Query chains by topic
chains = engine.search("database decisions")

# Get an XML digest for LLM prompts
digest = engine.get_digest("why did we switch databases?")
```
## Agent-to-Agent (A2A) Protocol

Cross-agent memory sharing with visibility controls:

```python
from mnemosynth.a2a import A2AProtocol, MemoryRequest

proto = A2AProtocol(brain)
proto.register_agent("researcher", capabilities=["search", "analyze"])

# An agent requests memories
response = proto.handle_request(MemoryRequest(agent_id="coder", query="user preferences"))

# An agent shares a memory
proto.share_memory("User prefers dark mode", source_agent="researcher", visibility="shared")
```
## OpenTelemetry Tracing

Full observability with zero-effort instrumentation:

```python
from mnemosynth.telemetry import instrument_brain, get_metrics

brain = Mnemosynth()
instrument_brain(brain)  # All operations now emit traces + metrics

# Or use the pre-instrumented wrapper:
from mnemosynth.telemetry import TracedMnemosynth
brain = TracedMnemosynth()

# Check metrics
print(get_metrics())  # {remember_total: 42, recall_duration_ms_avg: 3.2, ...}
```
## Production Database Backends

Scale beyond the built-in stores:

```python
# Qdrant (vector search at scale)
from mnemosynth.stores.qdrant_store import QdrantEpisodicStore
store = QdrantEpisodicStore(url="http://localhost:6333")

# FalkorDB (graph at scale)
from mnemosynth.stores.falkordb_store import FalkorDBSemanticStore
store = FalkorDBSemanticStore(host="localhost", port=6379)

# PostgreSQL (ACID persistence)
from mnemosynth.stores.postgres_store import PostgresStore
store = PostgresStore(dsn="postgresql://user:pass@localhost/mnemosynth")
```

Install: `pip install "mnemosynth[production]"`
## MCP Tools

| Tool | Description |
|---|---|
| `add_memory` | Store with auto-classification |
| `search_memory` | Semantic search across all tiers |
| `get_digest` | Compressed XML context block |
| `get_contradictions` | Surface conflicting facts |
| `run_dream` | Trigger consolidation |
| `forget` | Delete by ID |
| `get_stats` | Memory statistics |
| `get_provenance` | Full audit trail |
## CLI

```bash
mnemosynth serve                 # Start MCP server
mnemosynth stats                 # Memory dashboard
mnemosynth search "query"        # Semantic search
mnemosynth inspect               # Browse memory tree
mnemosynth dream                 # Run consolidation
mnemosynth health                # System diagnostics
mnemosynth export -o out.json    # Export to JSON
mnemosynth forget <ID>           # Delete a memory
mnemosynth reset --confirm       # Wipe everything
```
## Configuration

Override defaults in `~/.mnemosynth/config.yaml`:

```yaml
embedding_model: all-MiniLM-L6-v2
max_episodic_memories: 10000
max_semantic_nodes: 5000
decay:
  half_life_days: 30.0
  min_confidence: 0.1
dream:
  interval_hours: 24
  min_cluster_size: 3
digest:
  max_tokens: 150
  top_k: 5
```
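The `decay` settings interact: `half_life_days` sets how fast confidence halves, and `min_confidence` is the floor below which a memory is effectively expired. A quick worked example (the helper function here is illustrative, not part of Mnemosynth's API) shows how long a fully-confident memory survives under the defaults above:

```python
import math

# Illustrative: days until confidence falls below `min_confidence`
# under half-life decay. Solves 0.5**(t / half_life) = min_conf / initial.
def days_until_expiry(initial: float, min_confidence: float,
                      half_life_days: float) -> float:
    return half_life_days * math.log2(initial / min_confidence)

# With half_life_days=30 and min_confidence=0.1, a memory starting at
# confidence 1.0 decays past the floor in roughly 100 days:
print(round(days_until_expiry(1.0, 0.1, 30.0), 1))
```

So with the defaults, an unverified fact that is never re-observed ages out in a little over three months; corroboration resets that clock.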
## Architecture

```
Claude Desktop / Cursor / Windsurf / Python API / CrewAI / LangChain
            │  MCP (stdio) / Python import / A2A Protocol
            ▼
    Router · Digest · Decay · Dream · Causal Chains
    Sentiment · Contradiction · Immune · Telemetry
            │
            ▼
Episodic (LanceDB/Qdrant) · Semantic (NetworkX/FalkorDB) · Procedural (JSON)
            │
            ▼
        ~/.mnemosynth/  (or PostgreSQL)
```
## Optional Dependencies

| Extra | What It Adds |
|---|---|
| `mnemosynth[ml]` | PyTorch, Transformers (DeBERTa, DistilBERT) |
| `mnemosynth[production]` | Qdrant, FalkorDB, PostgreSQL, OpenTelemetry |
| `mnemosynth[crewai]` | CrewAI adapter |
| `mnemosynth[langchain]` | LangChain adapter |
| `mnemosynth[autogen]` | AutoGen adapter |
| `mnemosynth[adapters]` | All framework adapters |
Built by Vasudev Jaiswal · Apache 2.0