
N3MemoryCore MCP Lite — ephemeral (Redis-backed, 7d TTL) hybrid (vector + BM25) memory for Claude and other MCP clients

Project description

N3MemoryCore MCP — Lite (Ephemeral)

N3MC-MCP-Lite is an "external memory server" for MCP-compatible editors such as Claude Code, Cursor, and Windsurf. It runs as an MCP server so the AI assistant can save and search conversation and code context across sessions.

A NeuralNexusNote™ product — free Lite build: ephemeral hybrid (vector + BM25) memory exposed as a Model Context Protocol server, backed by Redis Stack with a 7-day TTL per entry.

🇯🇵 Japanese version here 🛡️ Development Philosophy


Lite vs. Paid

Build              Storage                           Durability         Where
Lite (this repo)   Redis Stack (RediSearch)          7d TTL, volatile   Claude Marketplace
Paid               SQLite + sqlite-vec (local file)  Permanent          Separate distribution

Same MCP surface (five tools, same ranking formula). The 7-day TTL and volatile Redis storage are design features, not limitations — they make the Lite build the better fit for:

  • Agentic code-generation loops — failed attempts and abandoned designs don't bleed into the next task; docker restart redis-stack wipes the slate clean.
  • Multi-agent collaboration — decisions made during one task don't contaminate unrelated follow-ups.
  • Experimental / throwaway prototyping — leave it alone and memory evaporates in 7 days, no pruning needed.

The Paid build targets the opposite use case: long-term knowledge accumulation, where persistence is the feature. Choose Lite for project-scoped memory and Paid for continuous memory.

What is this?

n3memorycore-mcp-lite is a local-only MCP server that gives Claude (and any other MCP-compatible client) short-lived memory across conversations. It stores text entries in a local Redis Stack instance with both a BM25 full-text index and a 768-dimension vector index (intfloat/e5-base-v2), and returns hybrid-ranked results.

Every operation runs on the user's machine. No API calls, no cloud storage.

Tools exposed

Tool            Purpose
search_memory   Hybrid (vector + BM25) search, ranked & time-decayed
save_memory     Persist a short entry (7d TTL, dedup: exact + near-duplicate)
list_memories   Most-recent entries, newest first
delete_memory   Remove a specific entry by id
repair_memory   Re-create the RediSearch index if missing

The server also ships behavioral instructions via MCP's initialize response, asking the client to search_memory at the start of each turn and save_memory after each meaningful exchange — so "auto-save" is preserved without any Claude Code hooks.
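The dedup behavior of save_memory (exact-text match plus near-duplicate rejection at a cosine threshold, 0.95 by default per the Configuration section) can be sketched roughly as follows. The helper names and the in-memory store are illustrative, not the package's actual code:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def is_duplicate(new_text, new_vec, existing, threshold=0.95):
    """Reject exact text matches and near-duplicates above the cosine threshold."""
    for text, vec in existing:
        if new_text == text:  # exact duplicate
            return True
        if cosine_similarity(new_vec, vec) >= threshold:  # near-duplicate
            return True
    return False

store = [("use redis db 0", [0.9, 0.1, 0.0])]
print(is_duplicate("use redis db 0", [0.2, 0.8, 0.1], store))  # exact match -> True
print(is_duplicate("unrelated note", [0.0, 0.1, 0.9], store))  # dissimilar  -> False
```

In the real server the comparison runs against 768-dimension e5-base-v2 embeddings stored in Redis rather than a Python list, but the threshold logic is the same idea.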

Prerequisites

1. Start Redis Stack

The Lite build requires Redis Stack (Redis + RediSearch module). The easiest way is Docker:

docker run -d --name redis-stack -p 6379:6379 redis/redis-stack-server:latest

That's it — the container exposes Redis on localhost:6379 and the server will find it automatically.
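If the server later fails to start, a quick way to confirm the container is actually listening is a plain TCP probe. This helper is illustrative, not part of the package:

```python
import socket

def redis_reachable(host="localhost", port=6379, timeout=1.0):
    """Return True if something accepts TCP connections on host:port
    (e.g. the redis-stack container started above)."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# A port with no listener (nothing normally binds port 1) reports unreachable:
print(redis_reachable(port=1, timeout=0.2))
```

A True result only proves the port is open; the server itself verifies that the RediSearch module is loaded.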

2. Install the package

Install from source (PyPI distribution is not yet available):

git clone https://github.com/NeuralNexusNote/n3mcmcp-lite
cd n3mcmcp-lite
pip install -e .

The first run downloads the ~400 MB embedding model from Hugging Face into the standard ~/.cache/huggingface/ directory.

Configure a client

Claude Desktop

Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):

{
  "mcpServers": {
    "n3memorycore-lite": {
      "command": "n3mc-mcp-lite",
      "args": []
    }
  }
}

Claude Code

.mcp.json is already included in this repository. Clone the repo, install the package, and Claude Code connects automatically — no manual configuration needed.

For other projects, add the following to that project's .mcp.json:

{
  "mcpServers": {
    "n3memorycore-lite": {
      "type": "stdio",
      "command": "n3mc-mcp-lite",
      "args": []
    }
  }
}

Data location

The Lite build does not store a database on disk — memories live in Redis and expire automatically. Only a small config.json sits in the platform-standard user data directory:

OS        Path
Windows   %LOCALAPPDATA%\n3memorycore-lite\
macOS     ~/Library/Application Support/n3memorycore-lite/
Linux     ~/.local/share/n3memorycore-lite/

Override with the N3MC_DATA_DIR environment variable.
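The lookup the table describes amounts to a platform switch with the environment variable taking priority. The function below is an illustrative sketch (the real package may resolve these paths differently, e.g. via a platform-dirs library), and the Windows branch approximates %LOCALAPPDATA% as <home>\AppData\Local:

```python
import os
from pathlib import Path

APP = "n3memorycore-lite"

def default_data_dir(platform, home=None):
    """Illustrative resolution of the config directory, mirroring the table above.
    N3MC_DATA_DIR, when set, overrides every platform default."""
    override = os.environ.get("N3MC_DATA_DIR")
    if override:
        return Path(override)
    home = Path(home) if home else Path.home()
    if platform == "win32":
        # approximation of %LOCALAPPDATA%
        return home / "AppData" / "Local" / APP
    if platform == "darwin":
        return home / "Library" / "Application Support" / APP
    return home / ".local" / "share" / APP  # Linux and other POSIX

print(default_data_dir("darwin", home="/Users/me"))
```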

Configuration

On first run, config.json is auto-generated with random UUIDs for owner_id and local_id. Editable defaults:

{
  "owner_id": "<uuid>",
  "local_id": "<uuid>",
  "redis_url": "redis://localhost:6379/0",
  "ttl_seconds": 604800,
  "dedup_threshold": 0.95,
  "half_life_days": 3,
  "bm25_min_threshold": 0.1,
  "search_result_limit": 20,
  "min_score": 0.2,
  "search_query_max_chars": 2000
}

redis_url can also be supplied via the N3MC_REDIS_URL environment variable (takes precedence over the config file).
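That precedence rule can be sketched in a few lines; resolve_redis_url is an illustrative helper, not the package's actual code:

```python
import json
import os
import tempfile

def resolve_redis_url(config_path):
    """N3MC_REDIS_URL wins over config.json, per the precedence rule above."""
    env_url = os.environ.get("N3MC_REDIS_URL")
    if env_url:
        return env_url
    with open(config_path) as f:
        return json.load(f)["redis_url"]

# Demo against a throwaway config file:
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump({"redis_url": "redis://localhost:6379/0"}, f)
    config_path = f.name

os.environ.pop("N3MC_REDIS_URL", None)
print(resolve_redis_url(config_path))   # falls back to the config file

os.environ["N3MC_REDIS_URL"] = "redis://elsewhere:6380/0"
print(resolve_redis_url(config_path))   # the env var wins
os.environ.pop("N3MC_REDIS_URL", None)
```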

Ranking formula

final_score = (0.7 * cosine_similarity + 0.3 * keyword_relevance) * time_decay

time_decay = 2 ^ (-days_elapsed / half_life_days)       (default half-life: 3 days)

With a default 3-day half-life (shorter than the 7-day TTL), time_decay is meaningful in the Lite build: a fresh memory scores 1.0, a 3-day-old one exactly 0.5, and a 7-day-old (near-expiry) entry ≈ 0.20 — pushing recent context ahead in the ranking.
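The formula translates directly into a few lines of Python. This is a sketch of the scoring math only, not the package's implementation:

```python
def time_decay(days_elapsed, half_life_days=3):
    """2^(-days/half_life): halves every half_life_days."""
    return 2 ** (-days_elapsed / half_life_days)

def final_score(cosine_sim, keyword_rel, days_elapsed, half_life_days=3):
    """final_score = (0.7*cosine + 0.3*keyword) * time_decay"""
    return (0.7 * cosine_sim + 0.3 * keyword_rel) * time_decay(days_elapsed, half_life_days)

print(round(time_decay(0), 2))             # 1.0  (fresh)
print(round(time_decay(3), 2))             # 0.5  (one half-life)
print(round(time_decay(7), 2))             # 0.2  (near expiry)
print(round(final_score(0.9, 0.6, 3), 3))  # (0.63 + 0.18) * 0.5 = 0.405
```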

Development

# Start Redis Stack first (see Prerequisites), then:
pip install -e ".[dev]"
pytest tests/ -q

Tests target Redis DB index 0 (configurable via N3MC_REDIS_TEST_URL) and run FLUSHDB on it before and after each test. RediSearch refuses to create indexes outside DB 0 (Cannot create index on db != 0), so a separate test DB is not an option: run the test suite against a dedicated Redis container, never against one that holds data you care about. Tests refuse to run if Redis isn't reachable.

Extending the Lite build

If you want to modify behavior (change the ranking formula, drop in a cross-encoder reranker, plug in a Japanese morphological tokenizer, etc.), start from the design spec shipped in this repository:

Appendix B of the spec lists optional extensions (cross-encoder reranker, save-time chunking, HyDE, Japanese morphological analysis) with drop-in points and library candidates. The spec gives an AI (or human) enough context to edit the code without breaking the TTL, dedup, or RediSearch contracts — it is the source of truth for design intent.

License

Apache License 2.0 — see LICENSE.

Project details


Download files


Source Distribution

n3memorycore_mcp_lite-1.1.0.tar.gz (60.4 kB)

Uploaded Source

Built Distribution


n3memorycore_mcp_lite-1.1.0-py3-none-any.whl (27.4 kB)

Uploaded Python 3

File details

Details for the file n3memorycore_mcp_lite-1.1.0.tar.gz.

File metadata

  • Download URL: n3memorycore_mcp_lite-1.1.0.tar.gz
  • Upload date:
  • Size: 60.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for n3memorycore_mcp_lite-1.1.0.tar.gz
Algorithm     Hash digest
SHA256        f30cea235b25a1c9af299e62c7e9c104787c6d90ce8423a29c9e7fdd648e7221
MD5           dbd8c170b7886b08cb252e472a5dc535
BLAKE2b-256   47049dae900ee2c56bf627cfdda361d824539f0aa90b3c95ef039dc5a2d3d837


Provenance

The following attestation bundles were made for n3memorycore_mcp_lite-1.1.0.tar.gz:

Publisher: workflow.yml on NeuralNexusNote/n3mcmcp-lite

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file n3memorycore_mcp_lite-1.1.0-py3-none-any.whl.

File metadata

File hashes

Hashes for n3memorycore_mcp_lite-1.1.0-py3-none-any.whl
Algorithm     Hash digest
SHA256        1d77b037cc812ebaf4e63587847a5dce78de471622eec8d7c6b572f03b97dc95
MD5           bfa0dd12f69206296dc3dc7a949e7d98
BLAKE2b-256   cc3ee9ec84f66a5fa6f86fe6e2361bbef12901bb77a6f87f2eaae709aaaca724


Provenance

The following attestation bundles were made for n3memorycore_mcp_lite-1.1.0-py3-none-any.whl:

Publisher: workflow.yml on NeuralNexusNote/n3mcmcp-lite

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
