Local-first persistent memory for AI agents - store, recall, and consolidate knowledge across sessions using FAISS, SQLite, and any LLM
# consolidation-memory

Local-first persistent memory for coding agents.

consolidation-memory stores episodic events, consolidates them into structured knowledge, and exposes a trust-aware retrieval stack (temporal recall, contradiction tracking, claim provenance, and drift-challenge workflows).
## What It Is

- Episode storage with semantic dedup and FAISS indexing.
- Hybrid recall across episodes, knowledge topics, structured records, and claims.
- Claim graph with provenance (`claim_sources`) and lifecycle events (`claim_events`).
- Temporal queries (`as_of`) for both knowledge and claims.
- Drift detection that maps changed files to anchored claims and marks impacted claims as challenged.
- Multi-scope persistence (namespace/project/app/agent/session columns) with compatibility defaults.
- Four access surfaces:
  - MCP server (`consolidation-memory serve`)
  - Python API (`MemoryClient`)
  - REST API (`consolidation-memory serve --rest`)
  - OpenAI-style tool schemas (`consolidation_memory.schemas.openai_tools`)
## Install

```
pip install "consolidation-memory[fastembed]"
```

Common extras:

- `consolidation-memory[rest]` for FastAPI endpoints
- `consolidation-memory[dashboard]` for the Textual dashboard
- `consolidation-memory[all,dev]` for full local development
## Quick Start

```
consolidation-memory init
consolidation-memory test
consolidation-memory serve
```

`consolidation-memory` with no subcommand defaults to `serve`.
## CLI Commands

```
serve           Start MCP server (default command)
serve --rest    Start REST API
init            Interactive setup
test            End-to-end health check
status          Runtime/system stats
consolidate     Trigger consolidation run
detect-drift    Challenge claims impacted by changed files
export          Export full snapshot JSON
import PATH     Import snapshot JSON
reindex         Rebuild vectors with current embedding backend
browse          Browse knowledge topics
setup-memory    Add reusable memory instructions to an agent file
dashboard       Launch Textual dashboard
```
## MCP Setup

```json
{
  "mcpServers": {
    "consolidation_memory": {
      "command": "/absolute/path/to/python",
      "args": ["-m", "consolidation_memory", "--project", "default", "serve"],
      "env": {
        "PYTHONUNBUFFERED": "1",
        "CONSOLIDATION_MEMORY_IDLE_TIMEOUT_SECONDS": "0"
      }
    }
  }
}
```

Prefer an exact Python interpreter over the `consolidation-memory` console script. It avoids PATH/env drift and is more reliable on Windows when MCP hosts restart the server.

For long-lived MCP hosts, keep `CONSOLIDATION_MEMORY_IDLE_TIMEOUT_SECONDS=0` unless you explicitly want the server to auto-exit when idle.
MCP tools exposed by `server.py`:

`memory_store`, `memory_recall`, `memory_store_batch`, `memory_search`, `memory_claim_browse`, `memory_claim_search`, `memory_detect_drift`, `memory_status`, `memory_forget`, `memory_export`, `memory_correct`, `memory_compact`, `memory_consolidate`, `memory_consolidation_log`, `memory_decay_report`, `memory_protect`, `memory_timeline`, `memory_contradictions`, `memory_browse`, `memory_read_topic`
## Python Example

```python
from consolidation_memory import MemoryClient

with MemoryClient(auto_consolidate=False) as mem:
    mem.store(
        "User prefers short PR summaries with concrete file paths.",
        content_type="preference",
        tags=["workflow", "reviews"],
    )
    result = mem.recall(
        "how should I format PR summaries?",
        n_results=5,
        include_knowledge=True,
    )
    print(len(result.episodes), len(result.knowledge), len(result.records), len(result.claims))
```
## REST API

Run:

```
pip install "consolidation-memory[rest]"
consolidation-memory serve --rest --host 127.0.0.1 --port 8080
```

For non-loopback binds (for example `--host 0.0.0.0`), set auth first:

```
export CONSOLIDATION_MEMORY_REST_AUTH_TOKEN="change-me"
consolidation-memory serve --rest --host 0.0.0.0 --port 8080
```

When auth is enabled, send `Authorization: Bearer <token>` on all endpoints except `/health`.

Endpoints:

- `GET /health`
- `POST /memory/store`
- `POST /memory/store/batch`
- `POST /memory/recall`
- `POST /memory/search`
- `POST /memory/claims/browse`
- `POST /memory/claims/search`
- `POST /memory/detect-drift`
- `GET /memory/status`
- `DELETE /memory/episodes/{episode_id}`
- `POST /memory/consolidate`
- `POST /memory/correct`
- `POST /memory/export`
- `POST /memory/compact`
- `GET /memory/browse`
- `GET /memory/topics/(unknown)`
- `POST /memory/timeline`
- `POST /memory/contradictions`
- `POST /memory/protect`
- `POST /memory/consolidation-log`
- `GET /memory/decay-report`
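A minimal client sketch for `POST /memory/recall`. This assumes the JSON body mirrors the Python client's `recall` parameters; the field names (`query`, `n_results`) and the helper name `build_recall_request` are illustrative assumptions, not a documented wire format:

```python
import json

def build_recall_request(token, query, n_results=5):
    # Assemble headers and body for POST /memory/recall; the field names
    # (query, n_results) are assumed to mirror MemoryClient.recall.
    headers = {"Content-Type": "application/json"}
    if token:  # auth token is required for non-loopback binds
        headers["Authorization"] = f"Bearer {token}"
    body = json.dumps({"query": query, "n_results": n_results})
    return headers, body

# Send with any HTTP client, e.g.:
# requests.post("http://127.0.0.1:8080/memory/recall", headers=headers, data=body)
```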
## OpenAI-Compatible Tools

Use:

- `consolidation_memory.schemas.openai_tools`
- `consolidation_memory.schemas.dispatch_tool_call`

This keeps tool definitions and dispatch behavior aligned with the same semantics used by MCP and REST.
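A typical wiring loop, sketched with a stand-in for `dispatch_tool_call` so it runs without the package installed; the response shape follows the OpenAI chat-completions tool-call format, and the stub's return value is invented for illustration:

```python
import json

def dispatch_tool_call(name, arguments):
    # Stand-in for consolidation_memory.schemas.dispatch_tool_call: the real
    # function routes a tool name plus parsed arguments to the memory backend.
    return {"tool": name, "args": arguments}

def handle_tool_calls(tool_calls):
    """Execute each tool call from a chat-completions response and
    return one tool-role message per call to feed back to the model."""
    messages = []
    for call in tool_calls:
        result = dispatch_tool_call(
            call["function"]["name"],
            json.loads(call["function"]["arguments"]),
        )
        messages.append({
            "role": "tool",
            "tool_call_id": call["id"],
            "content": json.dumps(result),
        })
    return messages
```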
## Scope Model (Compatibility + Shared Use)

By default, existing single-project usage still works.

When a scope envelope is provided, records are persisted with explicit scope dimensions:

- `namespace_*`
- `project_*`
- `app_client_*`
- `agent_*`
- `session_*`

This allows selective sharing without mixing unrelated contexts.

Optional `scope.policy` controls:

- `read_visibility`: `private` (default), `project`, `namespace`
- `write_mode`: `allow` (default), `deny`

Persisted ACL entities are also supported (`access_policies`, `policy_principals`, `policy_acl_entries`). When persisted ACL rows match the resolved scope/principal, they are authoritative over `scope.policy`.

Conflict rules: write `deny` overrides `allow`; read visibility resolves to the most restrictive level.
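The conflict rules can be sketched as two small resolvers; this is an illustrative reading of the stated rules, not the package's actual implementation:

```python
# Visibility levels ordered from most to least restrictive.
VISIBILITY_ORDER = ["private", "project", "namespace"]

def resolve_read_visibility(levels):
    # The most restrictive requested level wins; default to private.
    return min(levels, key=VISIBILITY_ORDER.index, default="private")

def resolve_write_mode(modes):
    # A single deny overrides any number of allows.
    return "deny" if "deny" in modes else "allow"
```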
## Storage Layout

Data is under `platformdirs.user_data_dir("consolidation_memory")/projects/<project>/`:

```
memory.db
faiss_index.bin
faiss_id_map.json
faiss_tombstones.json
.faiss_reload
knowledge/
knowledge/versions/
consolidation_logs/
backups/
```
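To locate a project's files programmatically, the layout above composes as follows; `project_root` is a hypothetical helper (not a package API), with `base_dir` standing in for `platformdirs.user_data_dir("consolidation_memory")`:

```python
from pathlib import Path

def project_root(base_dir, project="default"):
    # Resolve a project's data directory under the documented layout.
    return Path(base_dir) / "projects" / project

root = project_root("/tmp/demo-data", "work")
db_path = root / "memory.db"          # SQLite store
index_path = root / "faiss_index.bin" # FAISS vector index
```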
## Configuration

Config file discovery order:

1. `CONSOLIDATION_MEMORY_CONFIG`
2. Platform default config path
3. Built-in defaults

Every scalar field can be overridden with `CONSOLIDATION_MEMORY_<FIELD_NAME>`.

Examples:

```
CONSOLIDATION_MEMORY_PROJECT=work
CONSOLIDATION_MEMORY_EMBEDDING_BACKEND=fastembed
CONSOLIDATION_MEMORY_LLM_BACKEND=ollama
CONSOLIDATION_MEMORY_CONSOLIDATION_INTERVAL_HOURS=6
```
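The override precedence (environment variable, then config file, then built-in default) can be sketched as a small resolver; `resolve_setting` is illustrative, not the package's loader:

```python
import os

def resolve_setting(field, file_value=None, default=None):
    # Env var CONSOLIDATION_MEMORY_<FIELD_NAME> wins over the config file,
    # which wins over the built-in default.
    env_value = os.environ.get(f"CONSOLIDATION_MEMORY_{field.upper()}")
    if env_value is not None:
        return env_value
    if file_value is not None:
        return file_value
    return default
```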
## Documentation Map
- Architecture
- Roadmap
- Release Gates
- Novelty Metrics
- Novelty Eval Guide
- Builder Baseline
- External Review Playbook
- Recommended Agent Instructions
- Universal-memory strategy docs
## Development

```
git clone https://github.com/charliee1w/consolidation-memory
cd consolidation-memory
pip install -e ".[all,dev]"
python scripts/smoke_builder_base.py
pytest tests/ -q
ruff check src/ tests/
mypy src/consolidation_memory/
```
## Community
- Contributors: CONTRIBUTORS.md
- Issues: GitHub Issues
- Discussions: GitHub Discussions
## License, Etc.
Project policies: