
Local-first persistent memory for AI agents - store, recall, and consolidate knowledge across sessions using FAISS, SQLite, and any LLM


consolidation-memory


Local-first persistent memory for coding agents.

consolidation-memory stores episodic events, consolidates them into structured knowledge, and exposes a trust-aware retrieval stack (temporal recall, contradiction tracking, claim provenance, and drift challenge workflows).

What It Is

  • Episode storage with semantic dedup and FAISS indexing.
  • Hybrid recall across episodes, knowledge topics, structured records, and claims.
  • Claim graph with provenance (claim_sources) and lifecycle events (claim_events).
  • Temporal queries (as_of) for both knowledge and claims.
  • Drift detection that maps changed files to anchored claims and marks impacted claims as challenged.
  • Multi-scope persistence (namespace/project/app/agent/session columns) with compatibility defaults.
  • Four access surfaces:
    • MCP server (consolidation-memory serve)
    • Python API (MemoryClient)
    • REST API (consolidation-memory serve --rest)
    • OpenAI-style tool schemas (consolidation_memory.schemas.openai_tools)

Install

pip install consolidation-memory[fastembed]

Common extras:

  • consolidation-memory[rest] for FastAPI endpoints
  • consolidation-memory[dashboard] for the Textual dashboard
  • consolidation-memory[all,dev] for full local development

Quick Start

consolidation-memory init
consolidation-memory test
consolidation-memory serve

consolidation-memory with no subcommand defaults to serve.

CLI Commands

serve            Start MCP server (default command)
serve --rest     Start REST API
init             Interactive setup
test             End-to-end health check
status           Runtime/system stats
consolidate      Trigger consolidation run
detect-drift     Challenge claims impacted by changed files
export           Export full snapshot JSON
import PATH      Import snapshot JSON
reindex          Rebuild vectors with current embedding backend
browse           Browse knowledge topics
setup-memory     Add reusable memory instructions to an agent file
dashboard        Launch Textual dashboard

MCP Setup

{
  "mcpServers": {
    "consolidation_memory": {
      "command": "/absolute/path/to/python",
      "args": ["-m", "consolidation_memory", "--project", "default", "serve"],
      "env": {
        "PYTHONUNBUFFERED": "1",
        "CONSOLIDATION_MEMORY_IDLE_TIMEOUT_SECONDS": "0"
      }
    }
  }
}

Prefer an exact Python interpreter over the consolidation-memory console script. It avoids PATH/env drift and is more reliable on Windows when MCP hosts restart the server. For long-lived MCP hosts, keep CONSOLIDATION_MEMORY_IDLE_TIMEOUT_SECONDS=0 unless you explicitly want the server to auto-exit when idle.
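To get the exact interpreter path for the command field, you can ask Python directly:

```python
import sys

# Absolute path of the running interpreter; this is the value to paste
# into the "command" field of the MCP config above.
interpreter = sys.executable
print(interpreter)
```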

MCP tools exposed by server.py:

  • memory_store
  • memory_recall
  • memory_store_batch
  • memory_search
  • memory_claim_browse
  • memory_claim_search
  • memory_detect_drift
  • memory_status
  • memory_forget
  • memory_export
  • memory_correct
  • memory_compact
  • memory_consolidate
  • memory_consolidation_log
  • memory_decay_report
  • memory_protect
  • memory_timeline
  • memory_contradictions
  • memory_browse
  • memory_read_topic

Python Example

from consolidation_memory import MemoryClient

with MemoryClient(auto_consolidate=False) as mem:
    mem.store(
        "User prefers short PR summaries with concrete file paths.",
        content_type="preference",
        tags=["workflow", "reviews"],
    )

    result = mem.recall(
        "how should I format PR summaries?",
        n_results=5,
        include_knowledge=True,
    )

    print(len(result.episodes), len(result.knowledge), len(result.records), len(result.claims))

REST API

Run:

pip install consolidation-memory[rest]
consolidation-memory serve --rest --host 127.0.0.1 --port 8080

For non-loopback binds (for example --host 0.0.0.0), set auth first:

export CONSOLIDATION_MEMORY_REST_AUTH_TOKEN="change-me"
consolidation-memory serve --rest --host 0.0.0.0 --port 8080

When auth is enabled, send Authorization: Bearer <token> on all endpoints except /health.
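A minimal client sketch using only the standard library; the request payload shape here is an assumption for illustration, not the API's documented schema:

```python
import json
import os
import urllib.request

token = os.environ.get("CONSOLIDATION_MEMORY_REST_AUTH_TOKEN", "change-me")

# Build (but do not yet send) a store request carrying the bearer token.
req = urllib.request.Request(
    "http://127.0.0.1:8080/memory/store",
    data=json.dumps({"content": "User prefers short PR summaries."}).encode(),
    headers={
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment with the server running
```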

Endpoints:

  • GET /health
  • POST /memory/store
  • POST /memory/store/batch
  • POST /memory/recall
  • POST /memory/search
  • POST /memory/claims/browse
  • POST /memory/claims/search
  • POST /memory/detect-drift
  • GET /memory/status
  • DELETE /memory/episodes/{episode_id}
  • POST /memory/consolidate
  • POST /memory/correct
  • POST /memory/export
  • POST /memory/compact
  • GET /memory/browse
  • GET /memory/topics/{topic}
  • POST /memory/timeline
  • POST /memory/contradictions
  • POST /memory/protect
  • POST /memory/consolidation-log
  • GET /memory/decay-report

OpenAI-Compatible Tools

Use:

  • consolidation_memory.schemas.openai_tools
  • consolidation_memory.schemas.dispatch_tool_call

This keeps tool definitions and dispatch behavior consistent with the semantics used by MCP and REST.
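A hedged sketch of the tool-calling loop these are meant for. The dispatch_tool_call stub below stands in for the real consolidation_memory.schemas.dispatch_tool_call, and the handler and argument shapes are assumptions, not the package's documented API:

```python
import json

# Stand-in for consolidation_memory.schemas.dispatch_tool_call; with the
# package installed you would import the real function and pass the real
# consolidation_memory.schemas.openai_tools list to your LLM client.
def dispatch_tool_call(name: str, arguments: dict) -> dict:
    handlers = {"memory_status": lambda args: {"status": "ok"}}
    return handlers[name](arguments)

# A tool call in the shape an OpenAI-style client returns it.
call = {"name": "memory_status", "arguments": json.dumps({})}
result = dispatch_tool_call(call["name"], json.loads(call["arguments"]))
print(result)
```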

Scope Model (Compatibility + Shared Use)

By default, existing single-project usage still works.

When a scope envelope is provided, records are persisted with explicit scope dimensions:

  • namespace_*
  • project_*
  • app_client_*
  • agent_*
  • session_*

This allows selective sharing without mixing unrelated contexts.

Optional scope.policy controls:

  • read_visibility: private (default), project, namespace
  • write_mode: allow (default), deny

Persisted ACL entities are also supported (access_policies, policy_principals, policy_acl_entries). When persisted ACL rows match the resolved scope/principal, they are authoritative over scope.policy. Conflict rules: write deny overrides allow; read visibility resolves to the most restrictive level.
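The conflict rules can be expressed as a small resolver; the function names and value sets below are illustrative, not the library's internal API:

```python
# Lower rank = more restrictive; "private" wins over "project" wins over
# "namespace" when visibilities conflict.
RESTRICTIVENESS = {"private": 0, "project": 1, "namespace": 2}

def resolve_write(modes: list) -> str:
    # Any "deny" overrides any number of "allow" entries.
    return "deny" if "deny" in modes else "allow"

def resolve_read(levels: list) -> str:
    # The most restrictive visibility level wins.
    return min(levels, key=RESTRICTIVENESS.__getitem__)

print(resolve_write(["allow", "deny"]))        # deny
print(resolve_read(["namespace", "project"]))  # project
```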

Storage Layout

Data is under platformdirs.user_data_dir("consolidation_memory")/projects/<project>/.

memory.db
faiss_index.bin
faiss_id_map.json
faiss_tombstones.json
.faiss_reload
knowledge/
knowledge/versions/
consolidation_logs/
backups/
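The layout above can be composed programmatically. This sketch hard-codes a sample base directory as a stand-in for platformdirs.user_data_dir("consolidation_memory"), so the concrete paths are illustrative:

```python
from pathlib import Path

# Mirrors the layout described above: <base>/projects/<project>/...
def project_dir(base: str, project: str = "default") -> Path:
    return Path(base) / "projects" / project

p = project_dir("/home/user/.local/share/consolidation_memory")
print(p / "memory.db")
print(p / "knowledge" / "versions")
```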

Configuration

Config file discovery:

  1. CONSOLIDATION_MEMORY_CONFIG
  2. Platform default config path
  3. Built-in defaults

Every scalar field can be overridden with CONSOLIDATION_MEMORY_<FIELD_NAME>.

Examples:

CONSOLIDATION_MEMORY_PROJECT=work
CONSOLIDATION_MEMORY_EMBEDDING_BACKEND=fastembed
CONSOLIDATION_MEMORY_LLM_BACKEND=ollama
CONSOLIDATION_MEMORY_CONSOLIDATION_INTERVAL_HOURS=6
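The override convention can be sketched as follows; the DEFAULTS mapping here is illustrative, not the package's actual config schema:

```python
import os

# Illustrative defaults, not the real config schema.
DEFAULTS = {"project": "default", "embedding_backend": "fastembed"}

def resolve(field: str) -> str:
    # Each scalar field maps to CONSOLIDATION_MEMORY_<FIELD_NAME>;
    # the environment wins over the default when set.
    env_key = f"CONSOLIDATION_MEMORY_{field.upper()}"
    return os.environ.get(env_key, DEFAULTS[field])

os.environ["CONSOLIDATION_MEMORY_PROJECT"] = "work"
print(resolve("project"))            # work
print(resolve("embedding_backend"))  # fastembed
```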

Development

git clone https://github.com/charliee1w/consolidation-memory
cd consolidation-memory
pip install -e ".[all,dev]"
python scripts/smoke_builder_base.py
pytest tests/ -q
ruff check src/ tests/
mypy src/consolidation_memory/

