
OpenLynx — persistent semantic memory for Claude Code, Codex CLI and other coding agents. Auto-saves and recalls conversation history across sessions.

Project description

lynx-memory

Chinese README (中文)

Persistent, semantic, long-term memory for Claude Code. Conversations are auto-saved across sessions and the most relevant snippets are injected into context whenever you start a new prompt — no special syntax, no "remember this" phrasing required.

You      : What can I do tomorrow if the weather's nice — maybe walk the dog?
Claude   : Since you've got Dandan (your golden Border Collie) who needs a lot
           of exercise, try a long walk, frisbee, or a bike ride with him
           tagging along… 🐶
            (you never mentioned Dandan or owning a dog — memory recalled it
             from a past chat)

How it works

Three Claude Code hooks + a small Python service:

| Hook | What it does |
| --- | --- |
| UserPromptSubmit | Embeds your prompt and injects the top-K most similar prior turns. When a turn has a Haiku-generated summary, the summary is injected instead of the raw prose. |
| Stop | Persists the current user/assistant turn into SQLite + Chroma, then spawns a detached background process that asks Haiku to summarize the turn (no extra API key needed; it reuses your claude CLI session). |
| SessionEnd | Asks Claude Haiku to produce a coarse summary of the whole session. |
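
For reference, the three hooks land in ~/.claude/settings.json roughly like this. This is a sketch: the exact command strings are written by lynx-memory init and may differ.

```json
{
  "hooks": {
    "UserPromptSubmit": [
      { "hooks": [ { "type": "command", "command": "lynx-memory-hook user-prompt-submit" } ] }
    ],
    "Stop": [
      { "hooks": [ { "type": "command", "command": "lynx-memory-hook stop" } ] }
    ],
    "SessionEnd": [
      { "hooks": [ { "type": "command", "command": "lynx-memory-hook session-end" } ] }
    ]
  }
}
```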

Storage:

  • SQLite — source of truth for raw turns, per-turn Haiku summaries, and session summaries
  • Chroma — local vector index over turns + summaries
  • Voyage AI (voyage-3) — embeddings
  • Claude Haiku (claude-haiku-4-5-20251001) — per-turn summarization, called via claude -p so no extra ANTHROPIC_API_KEY is required
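
The Stop-hook persistence step can be sketched as follows. This is an illustration, not lynx-memory's actual code: the table and column names are assumptions, and embed() stands in for the Voyage AI call whose vector would be written to Chroma.

```python
import sqlite3

def embed(text: str) -> list[float]:
    # Stand-in for the real Voyage AI embedding call
    # (voyageai.Client().embed([text], model="voyage-3")).
    return [float(len(text)), float(text.count(" "))]

def persist_turn(db: sqlite3.Connection, user_msg: str, assistant_msg: str) -> int:
    # SQLite is the source of truth for raw turns.
    db.execute(
        "CREATE TABLE IF NOT EXISTS turns ("
        " id INTEGER PRIMARY KEY, user_msg TEXT, assistant_msg TEXT, summary TEXT)"
    )
    cur = db.execute(
        "INSERT INTO turns (user_msg, assistant_msg) VALUES (?, ?)",
        (user_msg, assistant_msg),
    )
    db.commit()
    vector = embed(user_msg + "\n" + assistant_msg)
    assert vector  # placeholder for chroma_collection.add(ids=..., embeddings=[vector], ...)
    return cur.lastrowid

db = sqlite3.connect(":memory:")
turn_id = persist_turn(db, "walk the dog?", "try a long walk with Dandan")
```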

Install

pip install lynx-memory
lynx-memory init

init will:

  1. Create ~/.claude/lynx-memory/ (data directory)
  2. Prompt for your VOYAGE_API_KEY (get one free at https://www.voyageai.com/)
  3. Write the default .env (MIN_SCORE=0.7, SUMMARY_ENABLED=1, SUMMARY_MODEL=claude-haiku-4-5-20251001, SUMMARY_BACKEND=auto) — the per-turn Haiku summarizer reuses your existing claude CLI session, so no extra ANTHROPIC_API_KEY is required by default
  4. Back up your existing ~/.claude/settings.json and add the three hooks
  5. Print verification steps
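
After step 3, the written .env typically looks like this (values match the defaults listed above; the key placeholder is yours to fill in):

```ini
# ~/.claude/lynx-memory/.env
VOYAGE_API_KEY=<your-voyage-key>
MIN_SCORE=0.7
SUMMARY_ENABLED=1
SUMMARY_MODEL=claude-haiku-4-5-20251001
SUMMARY_BACKEND=auto
```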

Then open a fresh Claude Code session, chat for a few turns, and run:

lynx-memory status

You should see turns and chroma_turns counters going up.

Codex CLI (cross-host memory)

Same memory store, also wired into Codex CLI:

lynx-memory init --target codex   # or --target all to install both

This writes ~/.codex/hooks.json, sets [features] codex_hooks = true in ~/.codex/config.toml, and registers three hooks (UserPromptSubmit → inject, Stop → persist, SessionStart → summarize the previous session since Codex has no SessionEnd event).

Codex's additionalContext field is fully respected, so retrieved memory is injected exactly as in Claude Code. Restart any running codex process for the hooks to take effect; they are loaded at session start.
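
Concretely, a UserPromptSubmit hook injects memory by printing JSON on stdout, and Codex reads the same additionalContext field. A sketch of the shape (the memory text is illustrative):

```json
{
  "hookSpecificOutput": {
    "hookEventName": "UserPromptSubmit",
    "additionalContext": "Relevant past memory:\n- User's dog is a golden Border Collie named Dandan"
  }
}
```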

A turn typed in Claude Code can be recalled inside Codex (and vice versa) because both write to the same SQLite + Chroma store at ~/.claude/lynx-memory/.

CLI

lynx-memory init           Install hooks and slash commands
lynx-memory init-project   Create a .lynx-memory/ marker in cwd to enable
                             project-level storage
lynx-memory status         Show data dir, hook registration, DB stats
lynx-memory doctor         Verify Python, deps, API key, settings.json
lynx-memory merge          Merge memory between the project and global stores
                             (--from / --to is project|global, with --dry-run)
lynx-memory delete         Permanently delete memory for a scope
                             (--scope project|global|both, with double confirm)
lynx-memory uninstall      Remove hooks and slash commands (keeps your data)

Slash commands

lynx-memory init also installs five global slash commands into ~/.claude/commands/, callable from any Claude Code session:

| Command | What it does |
| --- | --- |
| /lynx-memory-status | Show current scope (project vs global) with stats for both |
| /lynx-memory-pull-global | Merge global memory into the current project (global → proj) |
| /lynx-memory-push-global | Merge current project memory into global (proj → global) |
| /lynx-memory-delete | Delete memory with mandatory double confirm (DELETE + y) |
| /lynx-memory-history | Open a local Web UI to browse, search, tag, and delete turns |

Each of these runs lynx-memory status / merge --dry-run first and asks for your approval before any write or destructive action.

Web UI

Type /lynx-memory-history in Claude Code (or run lynx-memory web) to launch a local FastAPI + React UI on 127.0.0.1. The page opens automatically in your browser and lets you:

  • Switch between project and global scopes
  • Page through every saved turn
  • Search by keyword (SQL LIKE) or semantic similarity (Voyage embeddings)
  • Tag turns (e.g. #work, #personal) and filter by tag
  • Delete a single turn (also clears its embedding from Chroma)
  • See the per-turn Haiku summary above each turn, with a one-click button to (re)generate it on demand

Usage

# default — listens on http://127.0.0.1:9527 and opens your browser
lynx-memory web

# pick a different port
lynx-memory web --port 8080

# or let the OS assign a free port
lynx-memory web --port 0

# don't auto-open the browser (useful in headless / SSH sessions)
lynx-memory web --no-open

| Action | What happens on disk |
| --- | --- |
| Delete a turn | Row removed from SQLite turns and turn_tags; embedding removed from Chroma |
| Add a tag | Inserted into SQLite tags (created on demand) and turn_tags |
| Remove a tag | Row removed from turn_tags; orphaned tag is GC'd from tags |
| Search (keyword) | SQL LIKE over user_msg and assistant_msg; no embedding call |
| Search (semantic) | One Voyage embedding per query, then top-K from Chroma |
| Regenerate summary | One claude -p call (Haiku); writes summary / summary_model / summary_ts back into the turns row |

The server only binds to 127.0.0.1. Press Ctrl+C to stop it.
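
The "Regenerate summary" action above boils down to one timed-out subprocess call. A minimal sketch, where the prompt wording is an assumption and the cmd parameter is injectable so the example can use echo instead of a real claude binary:

```python
import subprocess

def summarize_turn(text: str, cmd=("claude", "-p"), timeout: int = 60) -> str:
    """Ask the CLI for a one-sentence summary of a turn; raises on failure/timeout."""
    prompt = "Summarize this conversation turn in one sentence:\n" + text
    result = subprocess.run([*cmd, prompt], capture_output=True, text=True, timeout=timeout)
    result.check_returncode()
    return result.stdout.strip()

# For demonstration, `echo` stands in for the claude CLI:
summary = summarize_turn("walked the dog at the park", cmd=("echo",))
```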

Project-level vs global

Memory is global by default. Run this in a project root:

cd ~/code/my-project
lynx-memory init-project

It creates a .lynx-memory/ marker. As long as your cwd is inside that project, memory transparently switches to the project-level store at <project>/.lynx-memory/db/, isolated from the global one at ~/.claude/lynx-memory/.

Use /lynx-memory-status to inspect the active scope, and /lynx-memory-pull-global / /lynx-memory-push-global to move history between the two layers.
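
Scope resolution can be pictured as a walk up from cwd. A sketch using the paths from this README, not the actual implementation:

```python
from pathlib import Path

def resolve_store(cwd: Path, home: Path) -> Path:
    """Return the active store: project-level if a .lynx-memory/ marker is
    found walking up from cwd, otherwise the global store."""
    for directory in [cwd, *cwd.parents]:
        marker = directory / ".lynx-memory"
        if marker.is_dir():
            return marker / "db"                  # project-level store
    return home / ".claude" / "lynx-memory"       # global store
```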

Configuration

All optional, set in ~/.claude/lynx-memory/.env:

| Variable | Default | Purpose |
| --- | --- | --- |
| VOYAGE_API_KEY | — | Required for embeddings |
| TOP_K | 5 | Max memories injected per prompt |
| MIN_SCORE | 0.7 | Cosine similarity floor (0–1) |
| SUMMARY_ENABLED | 1 | Set 0/false to disable per-turn Haiku summarization |
| SUMMARY_MODEL | claude-haiku-4-5-20251001 | Model used for per-turn summaries |
| SUMMARY_BACKEND | auto | auto → CLI when claude is on PATH, else SDK; force with cli or sdk |
| SUMMARY_TIMEOUT | 60 | Seconds before the claude -p subprocess is killed |
| ANTHROPIC_API_KEY | — | Only needed when SUMMARY_BACKEND=sdk (the CLI backend reuses your existing claude auth) |
| LYNX_MEMORY_DIR | ~/.claude/lynx-memory | Where SQLite + Chroma live |
| LYNX_MEMORY_SUMMARY_MODEL | claude-haiku-4-5-20251001 | Model used by SessionEnd |
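
At retrieval time, TOP_K and MIN_SCORE combine as "filter, then cap". A toy illustration, not the actual retrieval code:

```python
def select_memories(hits, top_k=5, min_score=0.7):
    """Keep at most top_k memories whose cosine similarity clears the floor.
    `hits` is a list of (text, cosine_similarity) pairs."""
    ranked = sorted(hits, key=lambda h: h[1], reverse=True)
    return [text for text, score in ranked if score >= min_score][:top_k]

picked = select_memories([("dog walk", 0.91), ("tax forms", 0.55), ("vet visit", 0.74)])
# → ["dog walk", "vet visit"]  (0.55 falls below the 0.7 floor)
```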

Optional: MCP server

You can also expose memory as MCP tools for Claude Code (search_memory, list_recent, stats, forget). Add to ~/.claude.json or .mcp.json:

{
  "mcpServers": {
    "lynx-memory": {
      "command": "lynx-memory-mcp"
    }
  }
}

Uninstall

lynx-memory uninstall                   # remove hooks + slash commands
lynx-memory delete --scope global       # delete the global store (confirms)
# or
rm -rf ~/.claude/lynx-memory            # nuke directly (irreversible)

Privacy

  • All data stays on your machine in ~/.claude/lynx-memory/.
  • Outbound calls: Voyage AI for embeddings (your prompt text), and Anthropic for per-turn Haiku summaries and end-of-session summaries (by default these go through your existing claude CLI session, so no extra key is required).
  • Set SUMMARY_ENABLED=0 if you don't want per-turn summaries to leave the box.
  • Point LYNX_MEMORY_DIR at an encrypted volume if you want the store encrypted at rest, using whatever filesystem-level encryption your OS provides.

Roadmap

  • Project-level / global dual-layer storage: Global by default; auto-switches to project-level when a .lynx-memory/ marker is found by walking up from cwd, so histories from different projects don't bleed into each other. Run lynx-memory init-project in a project root to create the marker. Search supports scope=auto|project|global|merged (hooks via LYNX_MEMORY_SCOPE env; MCP tools accept a scope argument).

  • Multi-CLI client support: Extend beyond Claude Code to Cursor CLI, Codex CLI, Gemini CLI. Provide lynx-memory install --client <name> to write MCP configs in one shot, with rules templates that force consistent recall on each client.

  • Import / export & cross-device sync: lynx-memory export / import for JSONL backup and restore; place db/ in iCloud / Dropbox / a Git repo, or use a built-in lynx-memory sync subcommand to share memory across machines.

  • Local Web UI memory browser: A local FastAPI + React UI with paging, keyword / semantic search, single-turn deletion, and tagging (e.g. #work, #personal). Launch with /lynx-memory-history (or lynx-memory web); the page exposes both project-level and global histories with a one-click scope toggle.

License

MIT — see LICENSE.

Project details


Download files

Download the file for your platform.

Source Distribution

openlynx-0.2.0.tar.gz (139.4 kB)

Uploaded Source

Built Distribution


openlynx-0.2.0-py3-none-any.whl (143.8 kB)

Uploaded Python 3

File details

Details for the file openlynx-0.2.0.tar.gz.

File metadata

  • Download URL: openlynx-0.2.0.tar.gz
  • Upload date:
  • Size: 139.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.15

File hashes

Hashes for openlynx-0.2.0.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | df6c4f09e8cb564348af837918c48842f5efb55192e3f59e09f1b11d65bdbbab |
| MD5 | 88565e6384e3f734492c798ff05e9af6 |
| BLAKE2b-256 | 5f866969de08cb102507d4c77732438db041594c23ed1670de155b4192ae1239 |


File details

Details for the file openlynx-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: openlynx-0.2.0-py3-none-any.whl
  • Upload date:
  • Size: 143.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.15

File hashes

Hashes for openlynx-0.2.0-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | eeb6713aab243e546d8b7a3c8edb708f733b3d039cb603546f1fded1e1cdb94b |
| MD5 | d4bd8bfd4134b6f667b266814849d23e |
| BLAKE2b-256 | 0e042dffdfb341832cfca7aa976e080bf861b4f045133f63312e2433ef507cf4 |

