
Project description

Mnemos

Obsidian-native AI memory palace with semantic search

Your AI's memory lives in your Obsidian vault. Human-readable. Searchable. Yours.

MIT License · Python 3.10+ · Inspired by MemPalace


Why Mnemos?

MemPalace proved that structured memory architecture works for AI. Mnemos takes the same idea and makes it Obsidian-native — your memories are markdown files you can read, edit, and organize.

|           | MemPalace                 | Mnemos                                    |
|-----------|---------------------------|-------------------------------------------|
| Storage   | ChromaDB binary (opaque)  | Obsidian markdown (you can read it)       |
| Mining    | English regex only        | Hybrid: regex + Claude API (any language) |
| Access    | AI only                   | You AND your AI                           |
| Search    | Semantic                  | Semantic + metadata + optional re-rank    |
| Deletion  | API call                  | Delete in Obsidian, auto-synced           |
| Ecosystem | Standalone                | Obsidian Graph View, Dataview, plugins    |

Quick Start

```bash
# Install
pip install git+https://github.com/mnemos-dev/mnemos.git

# Initialize your vault
mnemos init

# Connect to Claude Code
claude mcp add mnemos -- python -m mnemos --vault /path/to/your/vault
```

How It Works

Mnemos uses a Memory Palace architecture inspired by the ancient Greek method of loci:

```text
Your Obsidian Vault
  +-- Mnemos/
      +-- Wings/              (projects & people)
      |   +-- ProjectA/
      |   |   +-- auth/           (topic rooms)
      |   |   |   +-- decisions/      (memory types)
      |   |   |   +-- facts/
      |   |   |   +-- problems/
      |   +-- ProjectB/
      +-- Identity/           (who you are - L0)
      +-- _recycled/          (soft-deleted memories)
```

Every memory is a .md file with YAML frontmatter. ChromaDB runs alongside as a vector index for fast semantic search. Obsidian is the master, ChromaDB is the index — if it's not in your vault, it doesn't exist.
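For illustration, a memory note inside a hall might look like the sketch below. The frontmatter fields shown here are hypothetical, not Mnemos's documented schema:

```markdown
---
type: decision
wing: ProjectA
room: auth
created: 2026-01-15
tags: [jwt, auth]
---
We decided to use short-lived JWTs with refresh tokens instead of
server-side sessions, so the API stays stateless.
```

Because it is a plain markdown file, you can read, edit, or move it in Obsidian like any other note, and the watcher re-indexes it in ChromaDB.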

MCP Tools

Mnemos exposes 8 tools via the Model Context Protocol (MCP):

| Tool              | Description                                           |
|-------------------|-------------------------------------------------------|
| `mnemos_search`   | Semantic search with wing/room/hall filters           |
| `mnemos_add`      | Add a new memory                                      |
| `mnemos_mine`     | Extract memories from files or directories            |
| `mnemos_status`   | Palace statistics                                     |
| `mnemos_recall`   | Load context (L0 identity, L1 summaries, L2 details)  |
| `mnemos_graph`    | Query entity relationships                            |
| `mnemos_timeline` | Chronological entity history                          |
| `mnemos_wake_up`  | Session startup context (~200 tokens)                 |

Works with Claude Code, Cursor, ChatGPT, and any MCP-compatible client.
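Under the hood, MCP clients invoke these tools with the protocol's standard `tools/call` request. A hypothetical `mnemos_search` call might look like this (the argument names are illustrative, not the tool's documented schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "mnemos_search",
    "arguments": {
      "query": "why did we switch to JWT?",
      "wing": "ProjectA",
      "room": "auth"
    }
  }
}
```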

Mining

Mnemos extracts memories using a hybrid approach:

  1. Regex patterns detect decisions, problems, events, and preferences in Turkish and English.
  2. Claude API (optional) catches what regex misses and works in any language.

```bash
# Mine your session notes
mnemos mine Sessions/

# Use the Claude API for better extraction
pip install "mnemos[llm]"
mnemos mine Sessions/ --llm
```
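The regex stage can be pictured as a small pattern table that routes matched lines into halls. This is an illustrative sketch under assumed trigger phrases; `MINE_PATTERNS` and `mine_line` are hypothetical names, not Mnemos's actual implementation:

```python
import re

# Hypothetical pattern table: each hall gets a few trigger phrases
# in English and Turkish. Real patterns would be far more extensive.
MINE_PATTERNS = {
    "decisions": re.compile(r"\b(we decided|decided to|karar verdik)\b", re.IGNORECASE),
    "problems": re.compile(r"\b(bug|broken|fails|hata|sorun)\b", re.IGNORECASE),
    "preferences": re.compile(r"\b(i prefer|always use|tercih ederim)\b", re.IGNORECASE),
}

def mine_line(line: str) -> list[tuple[str, str]]:
    """Return (hall, text) candidates found in one line of notes."""
    hits = []
    for hall, pattern in MINE_PATTERNS.items():
        if pattern.search(line):
            hits.append((hall, line.strip()))
    return hits

# Lines this stage misses (paraphrases, other languages) are what the
# optional Claude API pass with --llm is meant to catch.
```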

External Sources (Read-Only)

Mine data from outside your vault without modifying the source:

```bash
# Mine Claude Code memory (one-shot, read-only)
mnemos mine ~/.claude/projects/my-project/memory --external
```

External sources are mined once to extract memories into your palace. The source files are never modified or watched.

File Watcher

Changes you make in Obsidian are automatically synced to ChromaDB:

| Action        | Result              |
|---------------|---------------------|
| Add a note    | Indexed in ChromaDB |
| Edit a note   | Re-indexed          |
| Delete a note | Removed from index  |
| Move a note   | Metadata updated    |

The watcher runs inside the MCP server. When the server restarts, it detects any changes made while it was offline.
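The offline catch-up amounts to diffing the vault's current state against the last indexed snapshot. A minimal sketch of that reconciliation, assuming paths are keyed to modification times (`diff_since_snapshot` is a hypothetical name, not Mnemos's actual code):

```python
def diff_since_snapshot(current: dict[str, float], snapshot: dict[str, float]):
    """Reconcile vault state after server downtime.

    `current` and `snapshot` map note paths to modification times.
    Returns (added, modified, deleted): the changes a restarted watcher
    would replay into ChromaDB.
    """
    added = sorted(p for p in current if p not in snapshot)
    modified = sorted(p for p in current
                      if p in snapshot and current[p] > snapshot[p])
    deleted = sorted(p for p in snapshot if p not in current)
    return added, modified, deleted
```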

Memory Stack (L0-L3)

Efficient context loading — your AI knows you without wasting tokens:

| Level | Content       | Tokens   | Loaded               |
|-------|---------------|----------|----------------------|
| L0    | Identity      | ~50      | Every session        |
| L1    | Wing summaries| ~150     | Every session        |
| L2    | Room details  | ~300-500 | When topic mentioned |
| L3    | Deep search   | ~200-400 | When asked           |
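The table above reads as a loading policy: cheap layers always load, deeper layers only on a trigger. A toy sketch of how a session's token budget accumulates (the names and the midpoint token estimates are hypothetical, not Mnemos's API):

```python
# (layer, estimated tokens, load condition); L2/L3 use midpoints of the
# ranges in the table above.
LAYERS = [
    ("L0", 50, "always"),
    ("L1", 150, "always"),
    ("L2", 400, "topic_mentioned"),
    ("L3", 300, "explicit_ask"),
]

def context_budget(topic_mentioned: bool = False, explicit_ask: bool = False) -> int:
    """Estimate tokens loaded for a session under the given triggers."""
    total = 0
    for _name, tokens, condition in LAYERS:
        if (condition == "always"
                or (condition == "topic_mentioned" and topic_mentioned)
                or (condition == "explicit_ask" and explicit_ask)):
            total += tokens
    return total
```

Note that the always-on baseline (L0 + L1) lands around 200 tokens, which matches the `mnemos_wake_up` figure in the tools table.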

Configuration

`mnemos.yaml` in your vault root:

```yaml
version: 1
vault:
  path: "/path/to/your/vault"
mining:
  languages: [tr, en]
  use_llm: false
  sources:
    - path: "Sessions/"
      mode: sessions
    - path: "Topics/"
      mode: general
halls:
  - decisions
  - facts
  - events
  - preferences
  - problems
```

Architecture

```text
Claude Code / Cursor / ChatGPT
        |
        | MCP (stdio)
        v
  +------------------+
  |  Mnemos Server   |
  |  (8 MCP tools)   |
  +-----|------|-----+
        |      |
   ChromaDB   SQLite
   (vectors)  (knowledge graph)
        |      |
        v      v
  +-----------------------+
  |   Obsidian Vault      |
  |   (.md files = truth) |
  +-----------------------+
```

Contributing

Contributions welcome! Mnemos is built from scratch (not a fork), inspired by MemPalace's palace architecture.

```bash
git clone https://github.com/mnemos-dev/mnemos.git
cd mnemos
pip install -e ".[dev,llm]"
pytest tests/ -v
```

License

MIT — Copyright 2026 Tugra Demirors / GYP Energy

Download files

Download the file for your platform.

Source Distribution

mnemos_dev-0.1.0.tar.gz (164.8 kB)

Uploaded Source

Built Distribution


mnemos_dev-0.1.0-py3-none-any.whl (29.8 kB)

Uploaded Python 3

File details

Details for the file mnemos_dev-0.1.0.tar.gz.

File metadata

  • Download URL: mnemos_dev-0.1.0.tar.gz
  • Size: 164.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.3

File hashes

Hashes for mnemos_dev-0.1.0.tar.gz
| Algorithm   | Hash digest                                                        |
|-------------|--------------------------------------------------------------------|
| SHA256      | ddafff6537eaf941c30185426a3e9a3cadcc59c2c75270a27bec40e46b5dfa32   |
| MD5         | b524992563575c50ac99d645e99f46ec                                   |
| BLAKE2b-256 | 06aed4ffe5e686e1cd5a26b239ddfddea4793639a359ee9796230adb510ef458   |


File details

Details for the file mnemos_dev-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: mnemos_dev-0.1.0-py3-none-any.whl
  • Size: 29.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.3

File hashes

Hashes for mnemos_dev-0.1.0-py3-none-any.whl
| Algorithm   | Hash digest                                                        |
|-------------|--------------------------------------------------------------------|
| SHA256      | bdfbc7ef77f25aae307ced7a9ce8a798d5392a554b919e946a3a190c5c7482b4   |
| MD5         | 7b1266428ba534cd260eff6dbc8989ac                                   |
| BLAKE2b-256 | 509f8117e0b2d816aef22dcb11b7b4336531b66750c24f7ea81ee91011620bb1   |

