Genesys

The intelligence layer for AI memory.

Scoring engine + causal graph + lifecycle manager for AI agent memory. Speaks MCP natively.

What is this

Genesys is not another vector database. It's a scoring engine + causal graph + lifecycle manager that makes AI memory actually work. Memories are scored by a multiplicative formula (relevance × connectivity × reactivation), connected in a causal graph, and actively forgotten when they become irrelevant. It plugs into any storage backend and speaks MCP natively.

Why

  • Flat memory doesn't scale. Dumping everything into a vector store gives you recall with zero understanding. The 500th memory buries the 5 that matter.
  • No forgetting = no intelligence. Real memory systems forget. Without active pruning, your AI drowns in stale context.
  • No causal reasoning. Vector similarity can't answer "why did I choose X?" — you need a graph.

Your AI remembers everything but understands nothing. Genesys fixes that.

Quick Start

Option 1: In-Memory (zero dependencies)

The fastest way to try Genesys. No database required — state is kept in memory and optionally persisted to a JSON file.

pip install genesys-memory
cp .env.example .env
# Set OPENAI_API_KEY in .env

uvicorn genesys.api:app --port 8000

To persist across restarts, set GENESYS_PERSIST_PATH in .env:

GENESYS_PERSIST_PATH=.genesys_state.json
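Under the hood the pattern is simple: load state from the JSON file at startup if it exists, write it back on change. A hedged sketch keyed off the real `GENESYS_PERSIST_PATH` variable (the function names are illustrative, not Genesys internals):

```python
import json
import os


def load_state(default=None):
    """Load persisted state from GENESYS_PERSIST_PATH, if configured and present."""
    path = os.environ.get("GENESYS_PERSIST_PATH")
    if path and os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    return default if default is not None else {}


def save_state(state):
    """Write state to GENESYS_PERSIST_PATH; no-op when unset (pure in-memory mode)."""
    path = os.environ.get("GENESYS_PERSIST_PATH")
    if path:
        with open(path, "w") as f:
            json.dump(state, f)
```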

Give this to Claude to set it up for you: "Install genesys-memory, create a .env with my OpenAI key, start the server on port 8000 with the in-memory backend, and connect it as an MCP server."

Option 2: Postgres + pgvector (production)

Persistent, scalable storage with vector search via pgvector.

pip install 'genesys-memory[postgres]'
cp .env.example .env

Edit .env:

OPENAI_API_KEY=sk-...
GENESYS_BACKEND=postgres
DATABASE_URL=postgresql://genesys:genesys@localhost:5432/genesys

Start Postgres and run migrations:

docker compose up -d postgres
alembic upgrade head
GENESYS_BACKEND=postgres uvicorn genesys.api:app --port 8000

Give this to Claude to set it up for you: "Install genesys-memory[postgres], start a Postgres container with pgvector using docker compose, run alembic migrations, create a .env with my OpenAI key and DATABASE_URL, start the server with GENESYS_BACKEND=postgres, and connect it as an MCP server."

Option 3: Obsidian Vault (local-first)

Turns your Obsidian vault into a Genesys memory store. Markdown files become memory nodes, [[wikilinks]] become causal edges. A SQLite sidecar (.genesys/index.db) handles indexing.
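Conceptually, deriving edges from a note is a matter of parsing its `[[wikilinks]]`. A minimal sketch of that idea (the regex and function name are illustrative, not Genesys's actual parser):

```python
import re

# Matches [[target]], [[target#heading]], and [[target|alias]],
# capturing only the target note name.
WIKILINK = re.compile(r"\[\[([^\]|#]+)(?:#[^\]|]*)?(?:\|[^\]]*)?\]\]")


def extract_links(markdown: str) -> list[str]:
    """Return wikilink targets, stripping heading anchors and display aliases."""
    return [m.strip() for m in WIKILINK.findall(markdown)]
```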

pip install 'genesys-memory[obsidian]'
cp .env.example .env

Edit .env:

OPENAI_API_KEY=sk-...
GENESYS_BACKEND=obsidian
OBSIDIAN_VAULT_PATH=/path/to/your/vault

Start the server:

uvicorn genesys.api:app --port 8000

On first start, Genesys indexes all .md files in the vault and generates embeddings. A file watcher re-indexes incrementally when you edit notes.

If OBSIDIAN_VAULT_PATH is not set, Genesys auto-detects by looking for .obsidian/ in ~/Documents/personal, ~/Documents/Obsidian, and ~/obsidian.
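The auto-detection described above amounts to checking each candidate directory for an `.obsidian/` folder. A hedged sketch (the candidate list comes from the docs; the function is illustrative):

```python
from pathlib import Path

# Candidate locations checked when OBSIDIAN_VAULT_PATH is unset
# (per the docs: ~/Documents/personal, ~/Documents/Obsidian, ~/obsidian).
CANDIDATES = ["Documents/personal", "Documents/Obsidian", "obsidian"]


def detect_vault(home=None):
    """Return the first candidate directory containing .obsidian/, else None."""
    home = home or Path.home()
    for rel in CANDIDATES:
        vault = home / rel
        if (vault / ".obsidian").is_dir():
            return vault
    return None
```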

Fully local (no API keys)

Use the local embedding provider to run Obsidian mode with zero external dependencies:

pip install 'genesys-memory[obsidian,local]'

Edit .env (no OPENAI_API_KEY needed):

GENESYS_BACKEND=obsidian
GENESYS_EMBEDDER=local
OBSIDIAN_VAULT_PATH=/path/to/your/vault

Start the server:

uvicorn genesys.api:app --port 8000

This uses all-MiniLM-L6-v2 (384-dim) via sentence-transformers for embeddings. The model is downloaded on first use (~80 MB).

Connect Claude Desktop — add to your claude_desktop_config.json:

{
  "mcpServers": {
    "genesys": {
      "url": "http://localhost:8000/mcp"
    }
  }
}

Or for Claude Code:

claude mcp add --transport http genesys http://localhost:8000/mcp

Give this to Claude to set it up for you: "Install genesys-memory[obsidian,local], create a .env with GENESYS_BACKEND=obsidian, GENESYS_EMBEDDER=local, and OBSIDIAN_VAULT_PATH to my vault at [YOUR_VAULT_PATH], start the server on port 8000, and connect it as an MCP server. No API keys needed."

Option 4: FalkorDB (graph-native)

Uses FalkorDB (a Redis-based graph database) for native graph traversal.

pip install 'genesys-memory[falkordb]'
cp .env.example .env

Edit .env:

OPENAI_API_KEY=sk-...
GENESYS_BACKEND=falkordb
FALKORDB_HOST=localhost

Start FalkorDB and the server:

docker compose up -d falkordb
uvicorn genesys.api:app --port 8000

Give this to Claude to set it up for you: "Install genesys-memory[falkordb], start a FalkorDB container using docker compose, create a .env with my OpenAI key and GENESYS_BACKEND=falkordb, start the server on port 8000, and connect it as an MCP server."

From source

git clone https://github.com/rishimeka/genesys.git
cd genesys
pip install -e '.[dev]'

Connect to your AI

Claude Code

claude mcp add --transport http genesys http://localhost:8000/mcp

Claude Desktop

Add to your claude_desktop_config.json:

{
  "mcpServers": {
    "genesys": {
      "url": "http://localhost:8000/mcp"
    }
  }
}

Any MCP client

Point your client at the MCP endpoint:

http://localhost:8000/mcp
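MCP over HTTP speaks JSON-RPC 2.0, so in principle any HTTP client can talk to the endpoint. A hedged sketch of the wire format for a `tools/list` request (real clients perform an `initialize` handshake and session management first, so use an MCP client library in practice; this only illustrates the shape of a request):

```python
import json
import urllib.request

# JSON-RPC 2.0 request body for the MCP "tools/list" method.
payload = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

req = urllib.request.Request(
    "http://localhost:8000/mcp",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Accept": "application/json, text/event-stream",
    },
)

# With the server running (and after the initialize handshake):
# with urllib.request.urlopen(req) as resp:
#     tools = json.loads(resp.read())["result"]["tools"]
```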

MCP Tools

| Tool | Description |
| --- | --- |
| `memory_store` | Store a new memory, optionally linking to related memories |
| `memory_recall` | Recall memories by natural language query (vector + graph) |
| `memory_search` | Search memories with filters (status, date range, keyword) |
| `memory_traverse` | Walk the causal graph from a given memory node |
| `memory_explain` | Explain why a memory exists and its causal chain |
| `memory_stats` | Get memory system statistics |
| `pin_memory` | Pin a memory so it's never forgotten |
| `unpin_memory` | Unpin a previously pinned memory |
| `delete_memory` | Permanently delete a memory |
| `list_core_memories` | List core memories, optionally filtered by category |
| `set_core_preferences` | Set user preferences for core memory categories |

How it works

Every memory is scored by three forces multiplied together:

decay_score = relevance × connectivity × reactivation
  • Relevance decays over time. Old memories fade unless reinforced.
  • Connectivity rewards memories with many causal links. Hub memories survive.
  • Reactivation boosts memories that keep getting recalled. Frequency matters.

Because the formula is multiplicative, a memory must score on all three axes to survive. A highly connected but never-accessed memory still decays. A frequently recalled but causally orphaned memory still fades.
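The README doesn't publish the exact component functions, so the toy sketch below uses assumed forms (exponential time decay, log-scaled connectivity, recall-count boost) purely to show the multiplicative effect:

```python
import math


def decay_score(age_days: float, n_links: int, n_recalls: int) -> float:
    """Toy multiplicative score. The three component forms are illustrative
    assumptions, not Genesys's actual formulas."""
    relevance = math.exp(-age_days / 30)       # fades with time
    connectivity = math.log1p(n_links)         # rewards causal hubs
    reactivation = 1 + math.log1p(n_recalls)   # boosts frequent recall
    return relevance * connectivity * reactivation


# A frequently recalled but causally orphaned memory scores zero:
orphan = decay_score(age_days=1, n_links=0, n_recalls=50)  # 0.0
```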

                    ┌─────────┐
                    │  STORE  │
                    └────┬────┘
                         │
                    ┌────▼────┐
                    │ ACTIVE  │◄──── reactivation
                    └────┬────┘
                         │ decay
                    ┌────▼────┐
                    │ DORMANT │
                    └────┬────┘
                         │ continued decay
                    ┌────▼────┐
           ┌────────│ FADING  │
           │        └─────────┘
           │ score=0, orphan,
           │ not pinned
      ┌────▼────┐
      │ PRUNED  │
      └─────────┘

Memories can also be promoted to core status — structurally important memories that are auto-pinned and never pruned.
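The state machine above can be sketched as a simple transition rule. The 0.5/0.1 score thresholds are illustrative assumptions; the prune condition (score zero, orphaned, not pinned) comes from the diagram:

```python
from enum import Enum


class Status(Enum):
    ACTIVE = "active"
    DORMANT = "dormant"
    FADING = "fading"
    PRUNED = "pruned"


def next_status(score: float, pinned: bool, has_links: bool) -> Status:
    """Illustrative lifecycle rule; the numeric thresholds are assumptions.
    Pinned or causally linked memories are never pruned."""
    if score >= 0.5:
        return Status.ACTIVE
    if score >= 0.1:
        return Status.DORMANT
    if score > 0 or pinned or has_links:
        return Status.FADING
    return Status.PRUNED  # score == 0, orphan, not pinned
```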

Benchmark Results

Tested on the LoCoMo long-conversation memory benchmark (1,540 questions across 10 conversations, category 5 excluded):

| Category | J-Score |
| --- | --- |
| Single-hop | 94.3% |
| Temporal | 87.5% |
| Multi-hop | 69.8% |
| Open-domain | 91.7% |
| **Overall** | **89.9%** |

Answer model: gpt-4o-mini | Judge model: gpt-4o-mini | Retrieval k=20

Full results and reproduction steps in benchmarks/.

Storage backends

| Backend | Install | Use case |
| --- | --- | --- |
| `memory` | Built-in | Zero deps, try it out |
| postgres + pgvector | `pip install 'genesys-memory[postgres]'` | Persistent, scalable |
| Obsidian vault | `pip install 'genesys-memory[obsidian]'` | Local-first knowledge base |
| FalkorDB | `pip install 'genesys-memory[falkordb]'` | Graph-native traversal |
| Custom | Bring your own | Implement `GraphStorageProvider` |
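A custom backend implements `GraphStorageProvider`. The skeleton below is a guess at a minimal surface (node storage plus edge traversal); check the source for the real base class and method signatures:

```python
class ToyProvider:
    """Minimal sketch of a custom backend. GraphStorageProvider is the real
    base class in Genesys; these method names are illustrative assumptions."""

    def __init__(self):
        self.nodes: dict = {}
        self.edges: list = []  # (src_id, dst_id) causal links

    def store(self, memory_id, payload):
        self.nodes[memory_id] = payload

    def get(self, memory_id):
        return self.nodes.get(memory_id)

    def link(self, src, dst):
        self.edges.append((src, dst))

    def neighbors(self, memory_id):
        """Outgoing causal links from a memory node."""
        return [d for s, d in self.edges if s == memory_id]
```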

Configuration

Copy .env.example to .env and set:

| Variable | Required | Description |
| --- | --- | --- |
| `OPENAI_API_KEY` | Unless `GENESYS_EMBEDDER=local` | Embeddings |
| `ANTHROPIC_API_KEY` | No | LLM memory processing (consolidation, contradiction detection) |
| `GENESYS_BACKEND` | No | `memory` (default), `postgres`, `obsidian`, or `falkordb` |
| `GENESYS_EMBEDDER` | No | `openai` (default) or `local` (sentence-transformers, no API key) |
| `DATABASE_URL` | If `postgres` | Postgres connection string |
| `OBSIDIAN_VAULT_PATH` | If `obsidian` | Path to your Obsidian vault |
| `FALKORDB_HOST` | If `falkordb` | FalkorDB host (default: `localhost`) |
| `GENESYS_USER_ID` | No | Default user ID for single-tenant mode |

See .env.example for all options.

Contributing

See CONTRIBUTING.md.

License

Apache 2.0

