
cursor-mem

Persistent memory for Cursor IDE — automatically records session context and keeps memory across sessions.

Chinese documentation: README_CN.md

Tired of re-explaining "where we left off" in every new chat? cursor-mem gives Cursor persistent memory: it automatically records your edits, shell commands, and MCP calls, then injects recent session summaries and key context into Cursor Rules so the next conversation picks up right away. No API key required — just pip install cursor-mem and cursor-mem install --global. When the agent needs details, it queries history on demand via MCP, so you say less and use fewer tokens. Like claude-mem, but built for Cursor and lighter.


Features

  • Cross-session memory: Remembers last session’s actions, edited files, and shell commands
  • Zero-config: Works out of the box with rule-based compression; no API key required
  • Optional AI summarization: Use any OpenAI-compatible API (e.g. free Gemini) for smarter summaries
  • Full-text search: FTS5 search over observations and sessions
  • MCP tools: 3-layer search workflow (~10x token savings) — memory_search (compact index), memory_timeline (anchor context), memory_get (full details), plus memory_important (workflow guide)
  • Web viewer: Browse memory stream in the browser with live updates
  • Multi-project isolation: Separate memory per project

Comparison with claude-mem

| Criterion | cursor-mem | claude-mem |
| --- | --- | --- |
| Target | Cursor IDE only, native hooks | Claude Code first, Cursor via adapter |
| Stack | Python 3.10+, FastAPI, SQLite | TypeScript/Bun, Express, SQLite + ChromaDB |
| Setup | pip install cursor-mem, then cursor-mem install | Clone, build, plugin/marketplace or Cursor standalone setup |
| Out-of-the-box | Works immediately with no API key (rule-based compression) | AI processing is central; free tier needs Gemini/OpenRouter config |
| Codebase size | ~20 core modules, single package | 600+ files, plugin + worker + skills |
| Context injection | .cursor/rules/cursor-mem.mdc (Cursor Rules) | Same for Cursor; Claude Code uses additionalContext |
| Search | SQLite FTS5 only (simple, no extra deps) | FTS5 + ChromaDB vector search (hybrid) |
| Dependencies | Python stdlib + FastAPI/Click/httpx | Node/Bun, Claude Agent SDK, ChromaDB, etc. |

When to choose cursor-mem: You use Cursor only, want minimal setup and no required API key, and prefer a small Python codebase. When to choose claude-mem: You use Claude Code or want vector/semantic search, token economics, or the full plugin ecosystem.


Quick start

# Install from PyPI
pip install cursor-mem

# One-shot setup (global; applies to all projects)
cursor-mem install --global

# Restart Cursor

From source (development):

pip install -e .
cursor-mem install --global

How it works

User submits prompt → beforeSubmitPrompt hook
  → init session + inject history into .cursor/rules/cursor-mem.mdc

Agent runs → afterShellExecution / afterFileEdit / afterMCPExecution hooks
  → capture operations, compress, store in SQLite

Agent stops → stop hook
  → generate session summary + refresh context file for next session
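The "capture, compress, store" step can be sketched in a few lines. This is an illustrative rule-based compressor, not cursor-mem's actual code: the hook names match the pipeline above, but the event field names (command, output, path) and the truncation budget are assumptions.

```python
# Hypothetical rule-based compressor: no AI involved, just keep the salient
# fields of a hook event and truncate long shell output. Field names are
# illustrative, not cursor-mem's real schema.
MAX_OUTPUT = 500  # characters of shell output to keep

def compress_observation(event: dict) -> dict:
    """Reduce a raw hook event to a compact observation record."""
    kind = event.get("hook", "unknown")
    obs = {"type": kind}
    if kind == "afterShellExecution":
        obs["command"] = event.get("command", "")
        out = event.get("output", "")
        if len(out) > MAX_OUTPUT:
            # Keep the head and tail of long output; drop the middle.
            half = MAX_OUTPUT // 2
            out = out[:half] + " [truncated] " + out[-half:]
        obs["output"] = out
    elif kind == "afterFileEdit":
        obs["file"] = event.get("path", "")
    return obs

raw = {"hook": "afterShellExecution", "command": "pytest -q",
       "output": "x" * 2000}
print(compress_observation(raw)["command"])
```

The point of the sketch: observations stay small and uniform before they hit SQLite, which is what keeps later context injection cheap.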

Commands

cursor-mem install [--global]   # Install hooks + start worker
cursor-mem start                # Start worker
cursor-mem stop                 # Stop worker
cursor-mem restart              # Restart worker
cursor-mem status               # Show status

cursor-mem config set <key> <val>   # Set config
cursor-mem config get [key]         # Show config

cursor-mem data stats             # Data stats
cursor-mem data projects          # List projects
cursor-mem data cleanup           # Clean old data
cursor-mem data export [file]     # Export data
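Dotted keys like ai.model presumably map onto nested JSON in ~/.cursor-mem/config.json (see Data locations below). A minimal sketch of that mapping, assuming nothing about cursor-mem's internals beyond the config file location:

```python
import json
from pathlib import Path

# Where cursor-mem keeps its config (per the Data locations section).
# This sketch builds the nested structure in memory; it does not write the file.
CONFIG_PATH = Path.home() / ".cursor-mem" / "config.json"

def set_key(config: dict, dotted: str, value):
    """Set config['ai']['model'] for the dotted key 'ai.model'."""
    node = config
    *parents, leaf = dotted.split(".")
    for part in parents:
        node = node.setdefault(part, {})
    node[leaf] = value
    return config

cfg = {}
set_key(cfg, "ai.enabled", True)
set_key(cfg, "ai.model", "gemini-2.0-flash")
print(json.dumps(cfg))
```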

Optional AI summarization

# Gemini (free tier)
cursor-mem config set ai.enabled true
cursor-mem config set ai.base_url "https://generativelanguage.googleapis.com/v1beta/openai"
cursor-mem config set ai.api_key "your-gemini-api-key"
cursor-mem config set ai.model "gemini-2.0-flash"

# Or any OpenAI-compatible API
cursor-mem config set ai.base_url "https://api.openai.com/v1"
cursor-mem config set ai.api_key "sk-..."
cursor-mem config set ai.model "gpt-4o-mini"
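With ai.base_url and ai.model set, the summarizer can talk to any endpoint that implements the standard OpenAI /chat/completions schema. A sketch of what such a request looks like (the prompt wording is invented; only the endpoint path and payload shape follow the OpenAI-compatible convention):

```python
# Build an OpenAI-compatible chat-completions request for session
# summarization. Sending it (e.g. with httpx, already a cursor-mem
# dependency) is left as a comment to keep this runnable offline.
def build_request(base_url: str, model: str, observations: list[str]) -> tuple[str, dict]:
    url = base_url.rstrip("/") + "/chat/completions"
    payload = {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "Summarize this coding session in three bullets."},
            {"role": "user", "content": "\n".join(observations)},
        ],
    }
    return url, payload

url, payload = build_request("https://api.openai.com/v1", "gpt-4o-mini",
                             ["edited cli.py", "ran pytest: 12 passed"])
# resp = httpx.post(url, json=payload,
#                   headers={"Authorization": "Bearer sk-..."})
print(url)
```

Because only the base URL and model change, the same code path serves Gemini's OpenAI-compatibility layer, OpenAI itself, or a local server.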

Web viewer

After install, open http://127.0.0.1:37800 for:

  • Session list and details
  • Observation timeline
  • Full-text search
  • Live SSE updates
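The live updates use Server-Sent Events, so they can also be consumed from a script. Below is a minimal SSE line parser; the event format cursor-mem actually emits is not documented here, so treat the payloads as placeholders.

```python
# Minimal Server-Sent Events parser: accumulate "data:" lines and yield a
# complete event at each blank line, per the SSE wire format.
def parse_sse(stream_lines):
    """Yield event data payloads from raw SSE lines."""
    data = []
    for line in stream_lines:
        if line.startswith("data:"):
            data.append(line[5:].strip())
        elif line == "" and data:  # blank line terminates an event
            yield "\n".join(data)
            data = []

events = list(parse_sse(['data: {"type": "observation"}', "", "data: hi", ""]))
print(events)
```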

MCP tools (3-layer workflow)

Registered in ~/.cursor/mcp.json on install. Follow the 3-layer pattern for ~10x token savings:

  1. memory_important — Workflow guide (always visible). Read first.
  2. memory_search(query, project?, type?, limit?, offset?, dateStart?, dateEnd?, orderBy?) — Step 1: Compact index (ID, time, title, type). ~50–100 tokens/result.
  3. memory_timeline(anchor?, depth_before?, depth_after?, query?, session_id?, project?, limit?) — Step 2: Context around an observation. Use anchor (observation ID) with depths. ~100–200 tokens/entry.
  4. memory_get(ids, orderBy?, limit?) — Step 3: Full details only for filtered IDs. ~500–1000 tokens/observation.

Never fetch full details without filtering via search/timeline first.
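The three layers can be illustrated against an in-memory stand-in for the store. The function names mirror the tools above; the data, signatures, and return shapes are simplified for the sketch and are not cursor-mem's real MCP responses:

```python
# Fake memory store keyed by observation ID, standing in for SQLite.
STORE = {
    1: {"title": "fix auth bug", "type": "file_edit", "detail": "edited auth.py"},
    2: {"title": "run tests", "type": "shell", "detail": "pytest: 14 passed"},
    3: {"title": "update README", "type": "file_edit", "detail": "edited README.md"},
}

def memory_search(query, limit=10):
    """Layer 1: compact index — IDs, titles, types only."""
    return [{"id": i, "title": o["title"], "type": o["type"]}
            for i, o in STORE.items() if query in o["title"]][:limit]

def memory_timeline(anchor, depth_before=1, depth_after=1):
    """Layer 2: neighboring observations around an anchor ID."""
    ids = range(anchor - depth_before, anchor + depth_after + 1)
    return [{"id": i, "title": STORE[i]["title"]} for i in ids if i in STORE]

def memory_get(ids):
    """Layer 3: full details, only for IDs filtered by layers 1-2."""
    return [STORE[i] for i in ids if i in STORE]

hits = memory_search("tests")          # layer 1: cheap candidate list
ctx = memory_timeline(hits[0]["id"])   # layer 2: what happened around it
full = memory_get([hits[0]["id"]])     # layer 3: details for one ID only
print(full[0]["detail"])
```

The token savings come from the funnel shape: most results die at the cheap index layer, and only a handful of IDs ever reach the expensive full-detail fetch.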


Project layout

cursor-mem/
├── cli.py           # CLI entry
├── installer.py     # Install logic
├── hook_handler.py  # Unified hook handler
├── config.py        # Config and paths
├── worker/          # FastAPI HTTP service
├── storage/         # SQLite layer
├── context/         # Context build and inject
├── summarizer/      # Rule-based + AI summarizer
├── mcp/             # MCP search tools
├── ui/              # Web viewer
├── pyproject.toml
└── README.md

Documentation

| Doc | English | Chinese |
| --- | --- | --- |
| Design | DESIGN.md | DESIGN_CN.md |
| Roadmap | ROADMAP.md | ROADMAP_CN.md |
| User manual | USER_MANUAL.md | USER_MANUAL_CN.md |

Testing

  • Automated: pip install -e ".[dev]" then pytest tests/ -v
  • In Cursor: See TESTING.md for manual test cases (hooks, MCP, worker, CLI).

License

This project is licensed under the Apache License 2.0. See LICENSE for the full text.


Data locations

  • Database: ~/.cursor-mem/cursor-mem.db
  • Config: ~/.cursor-mem/config.json
  • Logs: ~/.cursor-mem/logs/
  • Injected context: <project>/.cursor/rules/cursor-mem.mdc
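Since the database is plain SQLite with FTS5, it can be inspected directly. The table and column names below are assumptions for illustration (the real schema may differ); the query runs against an in-memory copy so it is safe to try:

```python
import sqlite3

# For real data, connect to ~/.cursor-mem/cursor-mem.db instead.
conn = sqlite3.connect(":memory:")
# Hypothetical FTS5 table; cursor-mem's actual table/column names may differ.
conn.execute("CREATE VIRTUAL TABLE obs USING fts5(title, body)")
conn.execute("INSERT INTO obs VALUES ('fix auth bug', 'edited auth.py and login.py')")
conn.execute("INSERT INTO obs VALUES ('run tests', 'pytest 14 passed')")

# FTS5 MATCH with relevance ordering via the built-in rank column.
rows = conn.execute(
    "SELECT title FROM obs WHERE obs MATCH ? ORDER BY rank", ("auth",)
).fetchall()
print(rows)
```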



Download files

Download the file for your platform.

Source Distribution

cursor_mem-0.2.0.tar.gz (43.5 kB, Source)

Built Distribution

cursor_mem-0.2.0-py3-none-any.whl (43.2 kB, Python 3)

File details

Details for the file cursor_mem-0.2.0.tar.gz.

File metadata

  • Download URL: cursor_mem-0.2.0.tar.gz
  • Upload date:
  • Size: 43.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.9

File hashes

Hashes for cursor_mem-0.2.0.tar.gz:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 03cbdb7d2fdf8fcd3439b079d8be56d312856165aaa2ca2c5771b4f0595043b5 |
| MD5 | 0549ab0a84229d1e5f90469d945cc678 |
| BLAKE2b-256 | 15a5bc845f753465fbc7d08c8a017edfa3c431183263f0a8e7851f494e1371cb |

File details

Details for the file cursor_mem-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: cursor_mem-0.2.0-py3-none-any.whl
  • Upload date:
  • Size: 43.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.9

File hashes

Hashes for cursor_mem-0.2.0-py3-none-any.whl:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | b8d379b8d9309c2c46e2b030a7e2dcc551c9341265a0d8316bba46bc72f8b3fe |
| MD5 | f3bfd0d2d91c515a172e39ed458cf596 |
| BLAKE2b-256 | fda8c843fc9c339aa1ea14b8c442acd7a8ed054e1c2271641890934a733e6fee |
