mindloom

Weave raw knowledge into structured Obsidian wikis, powered by Claude Code.

Inspired by Andrej Karpathy's LLM Knowledge Bases workflow. You curate (paste a link). Claude Code thinks (compile, cross-link, answer, lint).

Install

# Recommended: global install via uv (works from anywhere)
uv tool install mindloom

# Optional: browser extra for JS-heavy sites
uv tool install "mindloom[browser]"

Requires: Python 3.12+; Claude Code must be installed for compile / ask / lint.

How it works

loom add <url> -v ~/my-wiki
  ├─ 1. fetch: trafilatura / playwright / pymupdf (PDF)   (Python)
  ├─ 2. images downloaded, paths rewritten                 (Python)
  ├─ 3. saved to raw/ with YAML frontmatter                (Python)
  └─ 4. claude -p "compile this into the wiki"             (Claude Code)
         ├─ reads CLAUDE.md (the rules)
         ├─ reads _index.md (what exists)
         ├─ uses loom search + Grep to find related articles
         ├─ writes/updates wiki/ articles with [[wikilinks]]
         └─ updates _index.md
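
Concretely, the file written to raw/ in step 3 might look like this (field names here are illustrative; the exact frontmatter schema isn't documented on this page):

```markdown
---
title: "Example Article"
url: https://example.com/post
tags: [transformers, attention]
compiled: false
---

Extracted article body, with image paths rewritten to local copies...
```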

Usage

Every command (except init) requires --vault / -v pointing to your vault.

# Create a vault (open it in Obsidian)
loom init ~/my-wiki

# Add articles — fetches + auto-compiles
loom add "https://arxiv.org/abs/..." -t "transformers, attention" -v ~/my-wiki
loom add "https://blog.example.com" -t "rl, rlhf" -v ~/my-wiki
loom add "https://..." --no-compile -v ~/my-wiki   # just fetch, compile later

# Compile pending raw articles
loom compile -v ~/my-wiki
loom compile --full -v ~/my-wiki                    # recompile everything

# Ask questions (Claude Code researches your wiki)
loom ask "How does flash attention work?" -v ~/my-wiki
loom ask "Compare RLHF vs DPO" -o markdown -v ~/my-wiki   # save as .md
loom ask "Overview of transformers" -o marp -v ~/my-wiki   # save as slideshow

# Search (BM25-ranked via bb25)
loom search "attention mechanism" -v ~/my-wiki
loom search "attention mechanism" -n 5 -v ~/my-wiki        # limit results

# Rebuild search index from scratch
loom reindex -v ~/my-wiki

# Health check
loom lint -v ~/my-wiki

# Vault info
loom status -v ~/my-wiki

# Open in Obsidian
loom open -v ~/my-wiki                              # opens index
loom open wiki/attention.md -v ~/my-wiki            # opens specific note

Python API

All functions are importable and return dicts/values (no printing, no sys.exit).

from mindloom import (
    init_vault, add, compile_vault, ask,
    search, reindex, lint, status,
)

# Create a vault
vault = init_vault("~/my-wiki")            # returns Path

# Ingest a URL (fetch + save to raw/)
result = add(
    "https://arxiv.org/abs/2405.04434",
    vault="~/my-wiki",
    tags=["transformers", "attention"],
    compile_after=True,                    # auto-compile via Claude Code
)
print(result["title"], result["rel_path"]) # "Article Title" "raw/slug.md"

# Compile pending raw articles
compile_vault("~/my-wiki")                 # only uncompiled
compile_vault("~/my-wiki", full=True)      # recompile everything

# Ask questions (Claude Code researches the wiki)
answer = ask("How does flash attention work?", vault="~/my-wiki")
print(answer["answer"])

# Save as markdown or Marp slideshow
ask("Compare RLHF vs DPO", vault="~/my-wiki", output_format="markdown")
ask("Overview of transformers", vault="~/my-wiki", output_format="marp")

# BM25 search
hits = search("attention mechanism", vault="~/my-wiki", limit=5)
for h in hits:
    print(h["title"], h["score"], h["snippet"])

# Rebuild search index
doc_count = reindex("~/my-wiki")

# Health check (writes report to _meta/lint-report.md)
lint("~/my-wiki")

# Vault stats
info = status("~/my-wiki")
print(info)
# {'vault_path': '...', 'raw_count': 12, 'pending_count': 2,
#  'wiki_count': 8, 'output_count': 3, 'has_claude': True}
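
Because `status()` returns a plain dict, it composes easily with your own tooling. A small pure helper (hypothetical, built only on the keys shown in the example output above):

```python
def summarize(info: dict) -> str:
    """Render the status() dict as a one-line vault summary.

    Uses only the documented keys: raw_count, pending_count, wiki_count.
    """
    pending = info["pending_count"]
    note = f", {pending} pending" if pending else ""
    return (f"{info['wiki_count']} wiki articles from "
            f"{info['raw_count']} raw captures{note}")

info = {"vault_path": "~/my-wiki", "raw_count": 12, "pending_count": 2,
        "wiki_count": 8, "output_count": 3, "has_claude": True}
print(summarize(info))  # → 8 wiki articles from 12 raw captures, 2 pending
```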
