
Cross-platform, cross-agent memory system using FalkorDB graph database


FalkorDB Memory

The Graph-Native Memory System for Software Development


"Why did we choose PostgreSQL?" "Have we seen this bug before?" "What's the pattern for this API?"

Every developer knows these questions. The answers live in past conversations — buried in chat history, lost between sessions, scattered across tools. FalkorDB Memory gives your AI agents a persistent brain that remembers decisions, traces code patterns, and connects knowledge across projects.

One graph database — FalkorDB combines vector search + graph relationships in a single Redis-based store. No separate ChromaDB + SQLite. No context fragmentation.

Native graph queries — Trace decision chains, find problem patterns, discover hidden connections with Cypher: MATCH (d:Decision)-[:LED_TO]->(p:Problem) RETURN d.rationale, p.solution

Zero-cost extraction — Pure regex matching, no API calls, 96.6% benchmark accuracy. Decisions, bugs, learnings, patterns — auto-classified and stored.



Quick Start · Features · Why Graph? · MCP Tools · Roadmap


One graph. Zero fragmentation. Native connections.

15+ MCP Tools · Graph+Vector in One Database · $0 Extraction Cost · Native Cypher Queries

vs Alternatives

| Feature | FalkorDB Memory | MemPalace | claude-mem |
|---|---|---|---|
| Storage | One graph DB | ChromaDB + SQLite | SQLite + ChromaDB |
| Graph queries | Native Cypher | Simulated | None |
| Vector search | Built-in | ChromaDB | ChromaDB |
| Relationship traversal | Native edges | Logic layer | None |

Quick Start

```bash
# Install dependencies
pip install falkordb-memory-server

# Initialize settings (select Programming/Office/Life mode)
python -m falkordb_memory_server init

# Start FalkorDB (Docker)
docker run -d -p 6379:6379 falkordb/falkordb:latest

# Configure for Claude Code
claude mcp add falkordb-memory -- python -m falkordb_memory_server

# Configure for OpenClaw
curl -fsSL https://raw.githubusercontent.com/goodideal/falkordb-memory/main/scripts/install_openclaw.sh | bash

# Or configure for Gemini CLI
gemini extensions install https://github.com/goodideal/falkordb-memory
```

After installation, your AI agents automatically:

  • Capture memories from conversations and tool usage
  • Inject context from relevant past memories at session start
  • Search semantically across all stored knowledge

Features

🧠 Memory Extraction

Automatic classification of memories into 7 categories:

| Category | Description | Example Patterns |
|---|---|---|
| decision | Choices made, trade-offs | "We decided to...", "instead of" |
| preference | Coding style, conventions | "I always use...", "never do" |
| milestone | Achievements, breakthroughs | "It worked!", "fixed", "solved" |
| problem | Bugs, issues, failures | "error", "crash", "bug" |
| learning | Insights, patterns, tips | "learned", "insight", "tip" |
| emotional | Feelings, reactions | "love", "proud", "frustrated" |
| general | Default category | - |

Extraction Modes

Optimize extraction rules based on your context:

  • Programming (Default): Best for software architecture, code snippets & technical decisions.
  • Office: Best for scheduling, meeting summaries, emails & daily tasks. Automatically redacts phone numbers, ID card numbers, and bank card numbers.
  • Life: Best for personal reminders, home tasks & finances.

Zero API calls — Pure regex pattern matching achieves 96.6% LongMemEval accuracy (MemPalace benchmark).
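To make the zero-cost claim concrete, first-match regex classification of this kind can be sketched as follows. The pattern lists below are toy samples drawn from the table above, not the project's actual rule set, and `classify` is an illustrative name:

```python
import re

# Toy sketch of zero-cost, first-match regex classification. The pattern
# lists below are samples; the shipped rule set is larger and mode-aware.
PATTERNS = {
    "decision": [r"\bwe decided to\b", r"\binstead of\b"],
    "problem": [r"\berror\b", r"\bcrash\b", r"\bbug\b"],
    "learning": [r"\blearned\b", r"\binsight\b", r"\btip\b"],
    "milestone": [r"\bit worked\b", r"\bfixed\b", r"\bsolved\b"],
}

def classify(text):
    """Return the first matching category, falling back to 'general'."""
    lowered = text.lower()
    for category, patterns in PATTERNS.items():
        if any(re.search(p, lowered) for p in patterns):
            return category
    return "general"
```

Because no model call is involved, the per-memory classification cost is zero.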

🔗 Graph Storage

Memories are nodes. Relationships are edges. Discover connections:

```cypher
// Find all decisions that led to problems
MATCH (d:Decision)-[:LED_TO]->(p:Problem)
RETURN d.rationale, p.solution

// Find related learnings
MATCH (l:Learning)-[:DERIVED_FROM]->(c:Code)
WHERE l.category = 'security'
RETURN l, c
```
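For programmatic access outside the MCP tools, the same query can be composed in Python and run through the falkordb-py SDK. A minimal sketch; `decision_chain_query` is an illustrative helper, not a function exported by this package:

```python
# Sketch only: `decision_chain_query` is an illustrative helper, not part
# of falkordb-memory's public API.

def decision_chain_query(category=None):
    """Build a Cypher query tracing Decision -> Problem chains."""
    query = "MATCH (d:Decision)-[:LED_TO]->(p:Problem)"
    if category:
        query += f" WHERE d.category = '{category}'"
    return query + " RETURN d.rationale, p.solution"

# With a live FalkorDB instance, the query runs via the falkordb-py SDK:
#   from falkordb import FalkorDB
#   graph = FalkorDB(host="localhost", port=6379).select_graph("agent_memory")
#   rows = graph.query(decision_chain_query()).result_set
```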

🔍 Semantic Search

Vector similarity search powered by sentence-transformers:

```python
# Find similar memories
results = memory_semantic_search(
    query="authentication best practices",
    k=10,
    min_score=0.7
)
```
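Conceptually, this ranking is cosine similarity between a query embedding and stored memory embeddings, keeping the top-k results at or above `min_score`. A self-contained sketch with illustrative names (plain Python lists stand in for real sentence-transformers vectors):

```python
import math

# Toy model of the ranking behind memory_semantic_search. Names and data
# shapes here are illustrative, not the package's internals.

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def semantic_search(query_vec, memories, k=10, min_score=0.7):
    """memories: list of (memory_id, embedding) pairs."""
    scored = [(mid, cosine(query_vec, vec)) for mid, vec in memories]
    kept = [(mid, s) for mid, s in scored if s >= min_score]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)[:k]
```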

💡 Proactive Insights

Shift from passive recall to active context streaming: the system intercepts tool calls and user prompts, then proactively pushes relevant historical architecture decisions into your agent's context window.

  • Turn-Driven Injection: Searches the graph for conventions related to the currently requested files or topics.
  • Async State Rendering: Automatically updates localized workspace files (.omg/state/active-architecture.md) to align the agent's system prompt without blocking execution.
  • Zero Context Pollution: Protects token limits with strict 1-hour session deduplication and confidence-based truncation.
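The 1-hour deduplication rule above can be sketched as a small bookkeeping class. The class name and API are assumptions for this example, not the package's real interface:

```python
import time

# Illustrative sketch of strict 1-hour session deduplication: a given
# insight id is injected into the context at most once per window.
DEDUP_WINDOW_SECONDS = 3600

class SessionDeduper:
    def __init__(self, window=DEDUP_WINDOW_SECONDS):
        self.window = window
        self._last_injected = {}  # insight_id -> timestamp of last injection

    def should_inject(self, insight_id, now=None):
        now = time.time() if now is None else now
        last = self._last_injected.get(insight_id)
        if last is not None and now - last < self.window:
            return False  # already injected within the window
        self._last_injected[insight_id] = now
        return True
```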

🛡️ Privacy & Security Guardrails

Built-in PII and Secrets Redaction (PIIRedactor) ensures that sensitive data never leaks into the graph database.

  • Automated Secrets Filtering: Detects and replaces AWS Keys, JWT tokens, private keys, and passwords with [REDACTED_SECRET].
  • PII Scrubbing: Automatically removes email addresses and public IP addresses.
  • Developer-Friendly Bypasses: Safely ignores common dummy passwords (like test, dummy) and standard local network IPs (127.0.0.1, 192.168.x.x) to prevent blocking legitimate local development.
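A hedged sketch of this kind of redaction pipeline, including the local-IP bypass. The regexes below are samples; the real PIIRedactor rule set also covers JWTs, private keys, and passwords:

```python
import re

# Sample patterns only; the shipped redactor covers a wider rule set.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),  # AWS access key id
]
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
IP = re.compile(r"\b\d{1,3}(?:\.\d{1,3}){3}\b")

def _redact_ip(match):
    ip = match.group(0)
    # Developer-friendly bypass: keep local addresses intact
    if ip == "127.0.0.1" or ip.startswith("192.168."):
        return ip
    return "[REDACTED_PII]"

def redact(text):
    """Replace secrets and PII before anything is written to the graph."""
    for pattern in SECRET_PATTERNS:
        text = pattern.sub("[REDACTED_SECRET]", text)
    text = EMAIL.sub("[REDACTED_PII]", text)
    return IP.sub(_redact_ip, text)
```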

🪝 Cross-Agent Hooks

| Agent | Hook Events | Config |
|---|---|---|
| Claude Code | PreToolUse, PostToolUse | ~/.claude/settings.json |
| Gemini CLI | BeforeTool, AfterTool, SessionStart | gemini-extension.json |

Unified event mapping:

```
Claude Code          Gemini CLI           Unified
───────────────────────────────────────────────────
PreToolUse      →   BeforeTool      →   before_tool
PostToolUse     →   AfterTool       →   after_tool
-               →   SessionStart    →   session_start
```
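The mapping above boils down to a lookup table. A minimal sketch (the dictionary and `unify` are illustrative, not part of the package):

```python
# Unified hook-event mapping, keyed by (agent, native_event).
EVENT_MAP = {
    ("claude-code", "PreToolUse"): "before_tool",
    ("claude-code", "PostToolUse"): "after_tool",
    ("gemini-cli", "BeforeTool"): "before_tool",
    ("gemini-cli", "AfterTool"): "after_tool",
    ("gemini-cli", "SessionStart"): "session_start",
}

def unify(agent, event):
    """Translate an agent-native hook event to the unified event name."""
    return EVENT_MAP.get((agent, event), "unknown")
```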

Installation

Prerequisites

  1. FalkorDB (Redis-based graph database):

    docker run -d -p 6379:6379 --name falkordb falkordb/falkordb:latest
    
  2. Python 3.10+

Option 1: Install from PyPI

```bash
# Using pipx (recommended)
pipx install falkordb-memory-server

# Or using uv
uv tool install falkordb-memory-server

# Or standard pip
pip install falkordb-memory-server
```

Option 2: Install from Source (Development)

```bash
git clone https://github.com/goodideal/falkordb-memory.git
cd falkordb-memory
pip install -e .
```

Configure Claude Code

```bash
# Add MCP server
claude mcp add falkordb-memory -e FALKORDB_URL=redis://localhost:6379 -- python -m falkordb_memory_server

# Verify connection
claude mcp list
```

Optional: Enable Hooks for Auto-Capture

Add hooks to ~/.claude/settings.json for automatic memory capture:

```json
{
  "hooks": {
    "SessionStart": [{
      "matcher": ".*",
      "hooks": [{
        "type": "command",
        "command": "python /path/to/hooks/claude-code-hook.py"
      }]
    }],
    "PostToolUse": [{
      "matcher": ".*",
      "hooks": [{
        "type": "command",
        "command": "python /path/to/hooks/claude-code-hook.py"
      }]
    }]
  }
}
```

For Gemini CLI

```bash
# Add as MCP server to Gemini CLI
gemini mcp add falkordb-memory -- python3 -m falkordb_memory_server

# Or install as full extension
gemini extensions install https://github.com/goodideal/falkordb-memory

# Or link locally for development (uninstall first if previously installed)
gemini extensions uninstall falkordb-memory
gemini extensions link ./gemini-extension
```

MCP Tools

Memory Operations

| Tool | Description |
|---|---|
| memory_remember | Store a new memory |
| memory_recall | Search and retrieve memories |
| memory_associate | Create relationships between memories |
| memory_forget | Delete memories (supports TTL) |

Extraction Tools

| Tool | Description |
|---|---|
| memory_extract | Extract memories from text |
| memory_mine | Batch mine memories from files/directories |

Search Tools

| Tool | Description |
|---|---|
| memory_semantic_search | Vector similarity search |
| memory_query | Execute raw Cypher queries |

Session Management

| Tool | Description |
|---|---|
| memory_start_session | Begin a new agent session |
| memory_end_session | End current session |

Maintenance

| Tool | Description |
|---|---|
| memory_stats | View memory statistics |
| memory_cleanup | Clean expired memories |

Usage Examples

Store a Memory

"Remember that we decided to use PostgreSQL for the database because it handles JSON better"

Note: When using memory_remember tool directly, content and context must be dictionaries, not strings:

```json
{
  "memory_type": "decision",
  "content": {
    "title": "Database Choice",
    "description": "We chose PostgreSQL",
    "rationale": "Better JSON support"
  },
  "context": {
    "project": "my-project"
  }
}
```

Recall Memories

"What do you remember about authentication?"
"Find decisions we made about the API"

Extract from Files

"Mine the docs directory for important decisions"
"Extract learnings from this conversation"

Graph Queries

"Show me all problems we've solved"
"What decisions led to this architecture?"

Architecture

```
┌─────────────────────────────────────────────────────────────┐
│                     Agent Layer                              │
│   Claude Code │ Gemini CLI │ Codex │ Custom Agents          │
└─────────────────────┬───────────────────────────────────────┘
                      │ MCP Protocol / Hooks
                      ▼
┌─────────────────────────────────────────────────────────────┐
│              FalkorDB Memory MCP Server                      │
│  ┌─────────────┐ ┌─────────────┐ ┌─────────────────────────┐│
│  │Graph Tools  │ │Memory Tools │ │ Extraction Tools        ││
│  │- query      │ │- remember   │ │- memory_extract         ││
│  │- create     │ │- recall     │ │- memory_mine            ││
│  │- delete     │ │- associate  │ │                         ││
│  └─────────────┘ └─────────────┘ └─────────────────────────┘│
└─────────────────────┬───────────────────────────────────────┘
                      │ falkordb-py SDK
                      ▼
┌─────────────────────────────────────────────────────────────┐
│                    FalkorDB (Redis)                          │
│  Nodes: Session, Agent, Decision, Learning, Code, Concept   │
│  Edges: STARTED_BY, LED_TO, KNOWS, RELATED_TO               │
│  Indexes: Vector (semantic), Full-text (search)             │
└─────────────────────────────────────────────────────────────┘
```

Configuration

Environment Variables

```bash
FALKORDB_URL=redis://localhost:6379
GRAPH_NAME=agent_memory
EMBEDDING_MODEL=all-MiniLM-L6-v2
DEFAULT_TTL=604800  # 7 days
```
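A sketch of how a server might read this configuration. The variable names match the environment variables above and the defaults are the documented ones, but whether the package reads them exactly this way is an assumption:

```python
import os

# Illustrative config loader; names mirror the documented env variables.
def load_config(env=os.environ):
    return {
        "falkordb_url": env.get("FALKORDB_URL", "redis://localhost:6379"),
        "graph_name": env.get("GRAPH_NAME", "agent_memory"),
        "embedding_model": env.get("EMBEDDING_MODEL", "all-MiniLM-L6-v2"),
        "default_ttl": int(env.get("DEFAULT_TTL", "604800")),  # 7 days
    }
```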

Embedding Models

Local embedding models powered by sentence-transformers — no API key required.

| Model | Dimensions | Size | Speed | Accuracy |
|---|---|---|---|---|
| all-MiniLM-L6-v2 | 384 | ~90MB | Fast | Good |
| all-mpnet-base-v2 | 768 | ~420MB | Slower | Better |

How it works:

  • Models are auto-downloaded on first use to ~/.cache/torch/sentence_transformers/
  • After initial download, works completely offline — no API calls, no network required
  • Zero extraction cost — pure pattern matching for memory classification
  • Semantic search uses local embeddings for vector similarity

Install dependencies:

```bash
pip install sentence-transformers torch
```

Switch models:

```bash
# Use higher accuracy model
export EMBEDDING_MODEL=all-mpnet-base-v2
export EMBEDDING_DIMENSION=768
```

Roadmap

v1.0 (Foundation)

  • ✅ MCP server with graph operations
  • ✅ FalkorDB integration (Redis graph database)
  • ✅ Basic memory tools: remember, recall, forget
  • ✅ Node labels: Session, Agent, Decision, Learning, Code, Concept
  • ✅ Relationship types: STARTED_BY, LED_TO, DERIVED_FROM, KNOWS

v1.1 (Current)

  • ✅ Memory extraction with rule-based patterns
  • ✅ Claude Code and Gemini CLI hooks
  • ✅ Semantic search with local embeddings
  • ✅ Batch file/directory mining

v1.2 (Developer Core)

Solve everyday development pain points

  • ✅ Decision Chain Query — trace architecture decisions ("Why PostgreSQL?")
  • ✅ Bug Pattern Registry — remember solved problems ("Seen this error before?")
  • ✅ Code Pattern Memory — API usage and patterns ("React hooks best practices")
  • ✅ Privacy Control — <private> tags to exclude sensitive content
  • ✅ Wake-up Command — load project-critical facts into context
  • Incremental Mining — track file changes (hash/mtime), skip unchanged files, avoid duplicate storage

v1.3 (Project Knowledge)

Project-level context management

  • ✅ Project Context Auto-Load — conventions, configs, dependencies
  • ✅ Dependency Decision Memory — why we chose X over Y
  • ✅ Convention Registry — coding standards, naming patterns
  • ✅ Knowledge Graph — temporal entity relationships with Cypher

v1.4 (Dev Workflow Integration)

Integrate into development process

  • ✅ Refactor Trail — track refactoring reasons and impact
  • ✅ Test Memory — test strategies, edge cases, coverage decisions
  • ✅ API Usage Memory — third-party API experiences
  • ✅ Code Review Memory — review feedback and improvements

v1.5 (Enhanced Experience)

  • ✅ Web Viewer UI — graph visualization at localhost:37777
  • ✅ Endless Mode — biomimetic memory for long sessions (L0-L3 stack)
  • ✅ Progressive Disclosure — 3-layer search workflow
  • ✅ Conversation Mining — multi-format support (Claude, ChatGPT, Slack)

v2.0 (Enterprise)

  • 🔲 AAAK Dialect — compression for repeated entities at scale
  • 🔲 Advanced Graph Analytics — influence analysis, pattern detection, community detection
  • 🔲 Multi-tenant Support — isolated memory per team/project
  • 🔲 Team Memory Sharing — cross-team knowledge sync

Troubleshooting

First Run: Model Download

On first use, the embedding model will be downloaded automatically:

```
INFO:falkordb_memory_server.server:Loading embedding model: all-MiniLM-L6-v2
```

This is a one-time download (~90MB). After that, it works completely offline.

Cypher Syntax Notes

FalkorDB uses OpenCypher with some differences:

```cypher
// Not supported: the datetime() function
CREATE (n:Decision {created_at: datetime()})

// Correct: pass a timestamp string from Python
CREATE (n:Decision {created_at: "2024-01-15T10:30:00"})
```
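Generating that timestamp on the Python side can be sketched as follows (`iso_now` is an illustrative helper, not part of the package):

```python
from datetime import datetime, timezone

# Workaround sketch: build the timestamp in Python instead of calling
# datetime() in Cypher, then embed it in the CREATE statement.
def iso_now():
    """Current UTC time as an ISO-8601 string, second precision."""
    return datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S")

# e.g. graph.query(f'CREATE (n:Decision {{created_at: "{iso_now()}"}})')
```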

MCP Server Not Connecting

  1. Verify FalkorDB is running:

    docker ps | grep falkordb
    
  2. Check MCP server status:

    claude mcp list
    
  3. Restart the MCP server:

    claude mcp remove falkordb-memory
    claude mcp add falkordb-memory -e FALKORDB_URL=redis://localhost:6379 -- python -m falkordb_memory_server
    

Import Errors

If you see ModuleNotFoundError:

```bash
# Install in development mode
pip install -e .
```

References

This project draws inspiration from:

  • MemPalace — Rule-based extraction patterns, 96.6% LongMemEval benchmark
  • Claude-Mem — Hook lifecycle design, MCP integration patterns

License

MIT License - see LICENSE for details.
