
Karpathy-style LLM wiki from your Claude Code, Codex CLI, Cursor, and Obsidian sessions


llmwiki

LLM-powered knowledge base from your Claude Code, Codex CLI, Cursor, Gemini CLI, and Obsidian sessions. Built on Andrej Karpathy's LLM Wiki pattern.

👉 Live demo: pratiyush.github.io/llm-wiki

Rebuilt on every master push from the synthetic sessions in examples/demo-sessions/. No personal data. Shows every feature of the real tool (activity heatmap, tool charts, token usage, model info cards, vs-comparisons, project topics) running against safe reference data.



Every Claude Code, Codex CLI, Copilot, Cursor, and Gemini CLI session writes a full transcript to disk. You already have hundreds of them and never look at them again.

llmwiki turns that dormant history into a beautiful, searchable, interlinked knowledge base - locally, in two commands. Plus, it produces AI-consumable exports (llms.txt, llms-full.txt, JSON-LD graph, per-page .txt + .json siblings) so other AI agents can query your wiki directly.

./setup.sh                         # one-time install
./build.sh && ./serve.sh           # build + serve at http://127.0.0.1:8765

Contributing in one line: read CONTRIBUTING.md, keep PRs focused (one concern each), use feat: / fix: / docs: / chore: / test: commit prefixes, never commit real session data (raw/ is gitignored), no new runtime deps. CI must be green to merge.

Screenshots

All screenshots below are from the public demo site, which is rebuilt on every master push from the dummy example sessions. Your own wiki will look identical - just with your real work.

Home - projects overview with activity heatmap

llmwiki home page - LLM Wiki header, activity heatmap, and a grid of three demo projects (demo-blog-engine, demo-ml-pipeline, demo-todo-api)

All sessions - filterable table across every project

llmwiki sessions index - activity timeline above a table of eight demo sessions with project, model, date, message count, and tool-call columns

Session detail - full conversation + tool calls

llmwiki session detail - Rust blog engine scaffolding session showing summary, breadcrumbs, a TOML Cargo.toml block and a Rust main.rs block, both highlighted by highlight.js

Changelog - renders CHANGELOG.md as a first-class page

llmwiki changelog page - keep-a-changelog format with colored headings for Added / Fixed / Changed and auto-linked PR references

Projects index - freshness badges + per-project stats

llmwiki projects index - three demo project cards with green/yellow/red freshness badges showing how recently each project was touched

What you get

Human-readable

  • All your sessions, converted from .jsonl to clean, redacted markdown
  • A Karpathy-style wiki - sources/, entities/, concepts/, syntheses/, comparisons/, questions/ linked with [[wikilinks]]
  • A beautiful static site you can browse locally or deploy to GitHub Pages
    • Global search (Cmd+K command palette with fuzzy match over pre-built index)
    • highlight.js client-side syntax highlighting (light + dark themes)
    • Dark mode (system-aware + manual toggle with data-theme)
    • Keyboard shortcuts: / search · g h/p/s nav · j/k rows · ? help
    • Collapsible tool-result sections (auto-expand > 500 chars)
    • Copy-as-markdown + copy-code buttons
    • Breadcrumbs + reading progress bar
    • Filter bar on sessions table (project/model/date/text)
    • Reading time estimates (X min read)
    • Related pages panel at the bottom of every session
    • Activity heatmap on the home page
    • Model info cards with structured schema (provider, pricing, benchmarks)
    • Auto-generated vs-comparison pages between AI models
    • Append-only changelog timeline with pricing sparkline
    • Project topic chips (GitHub-style tags on project cards)
    • Agent labels (colored badges: Claude/Codex/Copilot/Cursor/Gemini)
    • Recently-updated card on the home page
    • Dataview-style structured queries in the command palette
    • Hover-to-preview wikilinks
    • Deep-link icons next to every heading
    • Mobile-responsive + print-friendly

AI-consumable (v0.4)

Every HTML page has sibling machine-readable files at the same URL:

  • <page>.html - human HTML with schema.org microdata
  • <page>.txt - plain text version (no HTML tags)
  • <page>.json - structured metadata + body

Site-level AI-agent entry points:

File What
/llms.txt Short index per llmstxt.org spec
/llms-full.txt Flattened plain-text dump (~5 MB cap) - paste into any LLM's context
/graph.jsonld Schema.org JSON-LD entity/concept/source graph
/sitemap.xml Standard sitemap with lastmod
/rss.xml RSS 2.0 feed of newest sessions
/robots.txt AI-friendly robots with llms.txt reference
/ai-readme.md AI-specific navigation instructions
/manifest.json Build manifest with SHA-256 hashes + perf budget

Every page also includes an <!-- llmwiki:metadata --> HTML comment that AI agents can parse without fetching the separate .json sibling.
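
If you only need a page's metadata, you don't even have to fetch the sibling. A minimal sketch for pulling that comment out of a page - it assumes the comment body is a JSON object and uses an illustrative URL, so check one of your own pages for the exact shape:

import json, re, urllib.request

def read_embedded_metadata(url):
    # Fetch the HTML and extract the llmwiki:metadata comment.
    html = urllib.request.urlopen(url).read().decode("utf-8")
    match = re.search(r"<!--\s*llmwiki:metadata\s*(.*?)-->", html, re.DOTALL)
    if not match:
        return None
    return json.loads(match.group(1))  # assumes the payload is JSON

# Illustrative local URL; substitute any page from your own build.
print(read_embedded_metadata("http://127.0.0.1:8765/sessions/demo-todo-api/adding-oauth-login.html"))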

Recipe - query graph.jsonld from your terminal

The JSON-LD graph isn't just for crawlers - you can ask quick questions about your wiki without leaving the shell. Example: print every session as a tree, grouped by project:

python3 examples/scripts/tree_from_graph.py

Output:

📚 8 sessions across 3 projects
   (site/graph.jsonld v1.3.0)

llmwiki/
├── demo-blog-engine/  (4 sessions)
│   ├── 2026-03-12  scaffolding-the-rust-blog-engine
│   ├── 2026-03-18  adding-syntax-highlighting
│   ├── 2026-03-25  rss-feed-and-sitemap
│   └── 2026-04-01  dark-mode-toggle
├── demo-ml-pipeline/  (2 sessions)
│   ├── 2026-01-20  training-data-pipeline
│   └── 2026-02-02  model-training-loop
└── demo-todo-api/  (2 sessions)
    ├── 2026-02-08  fastapi-project-bootstrap
    └── 2026-02-15  adding-oauth-login

The full script is stdlib-only at examples/scripts/tree_from_graph.py. The same recipe pattern works for any aggregation question - count sessions per model, find the largest project by token usage, list every entity that appears in 3+ sessions, etc. The graph is yours to slice.
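
For example, a one-off count of nodes by type - a sketch only, assuming the graph keeps its nodes in a top-level "@graph" array (open site/graph.jsonld to confirm the property names before leaning on them):

import json
from collections import Counter

with open("site/graph.jsonld") as f:
    doc = json.load(f)

# JSON-LD nodes typically sit under "@graph"; fall back to a bare list document.
nodes = doc.get("@graph", doc) if isinstance(doc, dict) else doc
counts = Counter(str(node.get("@type", "unknown")) for node in nodes)

for node_type, n in counts.most_common():
    print(f"{n:4d}  {node_type}")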

Quality & governance (v1.0)

  • 4-factor confidence scoring - source count, source quality, recency, cross-references, with Ebbinghaus-inspired decay per content type (see the sketch after this list)
  • 5-state lifecycle machine - draft → reviewed → verified → stale → archived, with 90-day auto-stale
  • 16 lint rules - 8 structural (frontmatter, link integrity, orphans, freshness, duplicates, index sync, ...) + 3 LLM-powered (contradictions, claim verification, summary accuracy) + stale_candidates (#51) + tags_topics_convention (#302) + stale_reference_detection (#303) + frontmatter_count_consistency (#378) + tools_consistency (#378)
  • Auto Dream - MEMORY.md consolidation after 24h + 5 sessions: resolve relative dates, prune outdated entries, 200-line cap
  • 9 navigation files - CLAUDE.md, AGENTS.md, MEMORY.md, SOUL.md, CRITICAL_FACTS.md, hints.md, hot.md + per-project hot caches
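
To make the decay idea concrete, here is a hypothetical scoring function - not llmwiki's actual weights or curve, just an illustration of how four factors and an Ebbinghaus-style half-life can combine into a single 0-1 confidence value:

import math

def confidence(source_count, source_quality, days_since_update, cross_refs, half_life_days=90):
    # All weights, caps, and the 90-day half-life below are made up for illustration.
    sources = min(source_count / 5, 1.0)      # saturates at 5 supporting sources
    quality = source_quality                   # assumed already normalized to 0..1
    recency = math.exp(-math.log(2) * days_since_update / half_life_days)  # halves every half_life_days
    links   = min(cross_refs / 10, 1.0)        # saturates at 10 cross-references
    return round(0.3 * sources + 0.3 * quality + 0.25 * recency + 0.15 * links, 2)

print(confidence(source_count=3, source_quality=0.8, days_since_update=30, cross_refs=4))   # 0.68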

Obsidian-native experience (v1.0)

  • link-obsidian CLI - symlinks the whole project into an Obsidian vault; graph view + backlinks + full-text search just work
  • Dataview dashboard - 10 ready-to-use queries (recently updated, by confidence, by lifecycle, by project, by entity type, open questions, stale pages)
  • Templater templates - 4 templates for source/entity/concept/synthesis pages, seeded with confidence + lifecycle + today's date
  • Category pages - tag-based index pages in both Dataview (Obsidian) and static markdown (HTML) modes
  • Integration guide - docs/obsidian-integration.md covers 6 recommended plugins with per-plugin configs

Automation

  • SessionStart hook - auto-syncs new sessions in the background on every Claude Code launch
  • Auto-build on sync - /wiki-sync triggers /wiki-build (configurable; default on)
  • One-shot pipeline - llmwiki all runs build → graph → export → lint in a single command (--strict for CI)
  • MCP server - 12 production tools (query, search, list, read, lint, sync, export, + confidence, lifecycle, dashboard, entity search, category browse) queryable from any MCP client (Claude Desktop, Cline, Cursor, ChatGPT desktop)
  • Pending ingest queue - SessionStart hook converts + queues; /wiki-sync processes the queue
  • No servers, no database, no npm - Python stdlib + markdown. Syntax highlighting loads from a highlight.js CDN at view time.

How it works

┌─────────────────────────────────────┐
│  ~/.claude/projects/*/*.jsonl       │  ← Claude Code sessions
│  ~/.codex/sessions/**/*.jsonl       │  ← Codex CLI sessions
│  ~/Library/.../Cursor/workspaceS…   │  ← Cursor
│  ~/Documents/Obsidian Vault/        │  ← Obsidian
│  ~/.gemini/                         │  ← Gemini CLI
└──────────────┬──────────────────────┘
               │
               ▼   python3 -m llmwiki sync
┌─────────────────────────────────────┐
│  raw/sessions/<project>/            │  ← immutable markdown (Karpathy layer 1)
│     2026-04-08-<slug>.md            │
└──────────────┬──────────────────────┘
               │
               ▼   /wiki-ingest  (your coding agent)
┌─────────────────────────────────────┐
│  wiki/sources/<slug>.md             │  ← LLM-generated wiki (Karpathy layer 2)
│  wiki/entities/<Name>.md            │
│  wiki/concepts/<Name>.md            │
│  wiki/syntheses/<Name>.md           │
│  wiki/comparisons/<Name>.md         │
│  wiki/questions/<Name>.md           │
│  wiki/index.md, overview.md, log.md │
└──────────────┬──────────────────────┘
               │
               ▼   python3 -m llmwiki build
┌─────────────────────────────────────┐
│  site/                              │  ← static HTML + AI exports
│  ├── index.html, style.css, ...     │
│  ├── sessions/<project>/<slug>.html │
│  ├── sessions/<project>/<slug>.txt  │  (AI sibling)
│  ├── sessions/<project>/<slug>.json │  (AI sibling)
│  ├── llms.txt, llms-full.txt        │
│  ├── graph.jsonld                   │
│  ├── sitemap.xml, rss.xml           │
│  ├── robots.txt, ai-readme.md       │
│  ├── manifest.json                  │
│  └── search-index.json              │
└─────────────────────────────────────┘

See docs/architecture.md for the full 3-layer Karpathy + 8-layer build breakdown.
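The two deterministic stages at the top and bottom of that diagram are plain CLI calls, so they chain easily from a script; the /wiki-ingest step in the middle is driven by your coding agent and isn't scripted here. A minimal sketch:

import subprocess, sys

STAGES = [
    ["python3", "-m", "llmwiki", "sync"],    # .jsonl transcripts -> raw/ markdown
    ["python3", "-m", "llmwiki", "build"],   # raw/ + wiki/ -> site/ HTML and AI exports
]

for cmd in STAGES:
    print("==>", " ".join(cmd))
    if subprocess.run(cmd).returncode != 0:
        sys.exit(1)   # stop at the first failing stage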

Documentation

Full production documentation lives under docs/. The editorial hub is docs/index.md - tutorials, per-agent guides, reference, and deployment, all in one place.

Start here:

Goal Read
Install and build your first site in 10 minutes Tutorial 01 → 02
Use llmwiki with Claude Code Tutorial 03
Use llmwiki with Codex CLI Tutorial 04
Query / lint / review your wiki daily Tutorial 05
Point llmwiki at an existing Obsidian / Logseq vault Tutorial 06
See four real end-to-end workflows Tutorial 07

Contributing to docs? See the style guide.

Install

macOS / Linux

git clone https://github.com/Pratiyush/llm-wiki.git
cd llm-wiki
./setup.sh

Windows

git clone https://github.com/Pratiyush/llm-wiki.git
cd llm-wiki
setup.bat

With pip (v0.3+)

pip install -e .                # basic - everything you need
pip install -e '.[pdf]'         # + PDF ingestion
pip install -e '.[dev]'         # + pytest + ruff
pip install -e '.[all]'         # all of the above

Syntax highlighting is now powered by highlight.js, loaded from a CDN at view time - no optional deps required.

What setup does

  1. Creates raw/, wiki/, site/ data directories
  2. Installs the llmwiki Python package in-place
  3. Detects your coding agents and enables matching adapters
  4. Optionally offers to install the SessionStart hook into ~/.claude/settings.json for auto-sync
  5. Runs a first sync so you see output immediately

For maintainers

Running the project? The governance scaffold lives under docs/maintainers/ and is loaded by a dedicated skill:

File What it's for
CONTRIBUTING.md Short rules for contributors - read this first
CODE_OF_CONDUCT.md Contributor Covenant 2.1
SECURITY.md Disclosure process for redaction bugs, XSS, data leaks
docs/maintainers/ARCHITECTURE.md One-page system diagram + layer boundaries + what NOT to add
docs/maintainers/REVIEW_CHECKLIST.md Canonical code-review criteria
docs/maintainers/RELEASE_PROCESS.md Version bump → CHANGELOG → tag → build → publish
docs/maintainers/TRIAGE.md Label taxonomy + stale-issue policy
docs/maintainers/ROADMAP.md Near-term plan + release themes
docs/maintainers/DECLINED.md Graveyard of declined ideas with reasons

Four Claude Code slash commands automate the common ops:

  • /review-pr <N> - apply the REVIEW_CHECKLIST to a PR and post findings
  • /triage-issue <N> - apply label, milestone, and priority to a new issue
  • /release <version> - walk the release process step by step
  • /maintainer - meta-skill that loads every governance doc as context

Running E2E tests

The unit suite (pytest tests/ - 472 tests) runs in milliseconds and covers every module. The end-to-end suite under tests/e2e/ is separate: it builds a minimal demo site, serves it on a random port, drives a real browser via Playwright, and runs scenarios written in Gherkin via pytest-bdd.

Why both? Unit tests lock the contract at the module boundary; E2E locks the contract at the user's browser. A diff that passes unit tests but breaks the Cmd+K palette will fail E2E.

Install the extras (one-time, ~300 MB for Chromium):

pip install -e '.[e2e]'
python -m playwright install chromium

Run the suite:

pytest tests/e2e/ --browser=chromium

Run a single feature:

pytest tests/e2e/test_command_palette.py --browser=chromium -v

The E2E suite is excluded from the default pytest tests/ run (see the --ignore=tests/e2e addopt in pyproject.toml) so you can iterate on the unit suite without waiting for browser installs. CI runs the E2E job as a separate workflow (.github/workflows/e2e.yml) that only fires on PRs touching build.py, the viz modules, or tests/e2e/**.

Feature files live under tests/e2e/features/ - one per UI area (homepage, session page, command palette, keyboard nav, mobile nav, theme toggle, copy-as-markdown, responsive breakpoints, edge cases, accessibility, visual regression). Step definitions are all in tests/e2e/steps/ui_steps.py. Adding a new scenario is usually a 2-line change to a .feature file plus maybe one new step.
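
As a rough illustration of how small that change is (the scenario wording, selector, and step names below are hypothetical - mirror the existing steps in ui_steps.py rather than these):

# tests/e2e/features/command_palette.feature (hypothetical scenario):
#   Scenario: Palette opens from the keyboard
#     When I press "Meta+k"
#     Then the command palette is visible

# tests/e2e/steps/ui_steps.py (illustrative step definitions):
from pytest_bdd import parsers, then, when

@when(parsers.parse('I press "{keys}"'))
def press_keys(page, keys):
    page.keyboard.press(keys)                               # Playwright's sync "page" fixture

@then("the command palette is visible")
def palette_is_visible(page):
    assert page.locator("#command-palette").is_visible()    # selector is an assumption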

Run locally with an HTML report:

pytest tests/e2e/ --browser=chromium \
  --html=e2e-report/index.html --self-contained-html
open e2e-report/index.html     # macOS - opens the browseable report

Where to see test reports:

What Where
Unit test results GitHub Actions → ci.yml → latest run → lint-and-test job logs
E2E HTML report GitHub Actions → e2e.yml → latest run → Artifacts → e2e-html-report (14-day retention)
Visual regression screenshots Same run → Artifacts → e2e-screenshots
Playwright traces (failed runs only) Same run → Artifacts → playwright-traces (open with playwright show-trace <zip>)
Demo site deploy status GitHub Actions → pages.yml → latest run

Locally, the HTML report is one file (e2e-report/index.html) that you can open in any browser - pass/fail per scenario, duration, stdout/stderr, screenshot on failure.

Scheduled sync

For a daily / weekly cron-style sync, schedule llmwiki sync directly via your OS's native job runner (launchd on macOS, systemd on Linux, Task Scheduler on Windows). Paths and adapter selection come from examples/sessions_config.json.
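
A tiny wrapper script you can hand to any of those schedulers - the log path and error handling here are illustrative, not part of llmwiki:

import datetime, pathlib, subprocess

LOG = pathlib.Path.home() / ".llmwiki-sync.log"    # illustrative location

result = subprocess.run(["python3", "-m", "llmwiki", "sync"], capture_output=True, text=True)
with LOG.open("a") as log:
    log.write(f"{datetime.datetime.now().isoformat()} exit={result.returncode}\n")
    log.write(result.stdout + result.stderr)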

CLI reference

llmwiki init                    # scaffold raw/ wiki/ site/ + seed nav files
llmwiki sync                    # convert .jsonl → markdown (auto-build + auto-lint if configured)
llmwiki build                   # compile static HTML + AI exports
llmwiki serve                   # local HTTP server on 127.0.0.1:8765
llmwiki adapters                # list available adapters + configured state (v1.0)
llmwiki graph                   # build knowledge graph (v0.2)
llmwiki lint                    # 16-rule wiki lint (v1.2)
llmwiki export <format>         # AI-consumable exports (v0.4)
llmwiki synthesize              # auto-ingest synthesis pipeline (v0.5)
llmwiki all                     # build → graph → export → lint in one shot (v1.2)
llmwiki version

Each subcommand has its own --help. All commands are also wrapped in one-click shell/batch scripts: sync.sh/.bat, build.sh/.bat, serve.sh/.bat, upgrade.sh/.bat.

Works with

Agent Adapter Status Added in
Claude Code llmwiki.adapters.claude_code ✅ Production v0.1
Obsidian (input) llmwiki.adapters.obsidian ✅ Production v0.1
Obsidian (output) llmwiki.obsidian_output ✅ Production v0.2
Codex CLI llmwiki.adapters.codex_cli ✅ Production v0.3
Cursor llmwiki.adapters.cursor ✅ Production v0.5
Gemini CLI llmwiki.adapters.gemini_cli ✅ Production v0.5
Copilot Chat llmwiki.adapters.copilot_chat ✅ Production v0.9
Copilot CLI llmwiki.adapters.copilot_cli ✅ Production v0.9
OpenCode / OpenClaw - ⏸ Deferred -

Adding a new agent is one small file - subclass BaseAdapter, declare SUPPORTED_SCHEMA_VERSIONS, ship a fixture + snapshot test.
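
Roughly what such a file looks like - the import path and method names below are assumptions for illustration, so copy the real BaseAdapter interface from an existing adapter such as llmwiki.adapters.codex_cli rather than this sketch:

import json
from pathlib import Path

from llmwiki.adapters.base import BaseAdapter   # import path is an assumption

class MyAgentAdapter(BaseAdapter):
    SUPPORTED_SCHEMA_VERSIONS = ("1.0",)        # transcript schema versions this adapter can parse
    SESSIONS_DIR = Path.home() / ".myagent" / "sessions"   # hypothetical agent

    def discover_sessions(self):
        # Yield every transcript the agent has written to disk.
        yield from sorted(self.SESSIONS_DIR.glob("*.jsonl"))

    def convert(self, session_path):
        # Grossly simplified: one markdown line per transcript turn.
        turns = [json.loads(line) for line in Path(session_path).read_text().splitlines() if line.strip()]
        return "\n\n".join(f"**{t.get('role', '?')}**: {t.get('content', '')}" for t in turns)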

MCP server

llmwiki ships its own MCP server (stdio transport, no SDK dependency) so any MCP client can query your wiki directly.

python3 -m llmwiki.mcp   # runs on stdin/stdout

Twelve production tools (7 core + 5 added in v1.0 #159):

Tool What
wiki_query(question, max_pages) Keyword search + page content (no LLM synthesis)
wiki_search(term, include_raw) Raw grep over wiki/ (+ optional raw/)
wiki_list_sources(project) List raw source files with metadata
wiki_read_page(path) Read one page (path-traversal guarded)
wiki_lint() Orphans + broken-wikilinks report
wiki_sync(dry_run) Trigger the converter
wiki_export(format) Return any AI-consumable export (llms.txt, jsonld, sitemap, rss, manifest)
wiki_confidence(min, max) Pages by confidence range (v1.0)
wiki_lifecycle(state) Pages by draft/reviewed/verified/stale/archived (v1.0)
wiki_dashboard() Health summary: counts by type, lifecycle, confidence (v1.0)
wiki_entity_search(name, entity_type) Search entities by name substring or type (v1.0)
wiki_category_browse(tag) Browse tags with counts, drill into specific tag (v1.0)

Register in your MCP client's config - e.g. for Claude Desktop, add to ~/Library/Application Support/Claude/claude_desktop_config.json:

{
  "mcpServers": {
    "llmwiki": {
      "command": "python3",
      "args": ["-m", "llmwiki.mcp"]
    }
  }
}
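
Any real MCP client handles the JSON-RPC handshake for you. If you just want to poke the server by hand, a rough stdio probe looks like the sketch below; the handshake fields follow the MCP spec, and the exact exchange the server expects may differ slightly:

import json, subprocess

proc = subprocess.Popen(["python3", "-m", "llmwiki.mcp"],
                        stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)

def send(msg):
    proc.stdin.write(json.dumps(msg) + "\n")
    proc.stdin.flush()

def request(method, params, id_):
    send({"jsonrpc": "2.0", "id": id_, "method": method, "params": params})
    return json.loads(proc.stdout.readline())

# Standard MCP handshake, then ask for the tool list described above.
request("initialize", {"protocolVersion": "2024-11-05", "capabilities": {},
                       "clientInfo": {"name": "probe", "version": "0.0"}}, id_=1)
send({"jsonrpc": "2.0", "method": "notifications/initialized"})
tools = request("tools/list", {}, id_=2)
print([t["name"] for t in tools["result"]["tools"]])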

Configuration

Single JSON config at examples/sessions_config.json. Copy to config.json and edit:

{
  "filters": {
    "live_session_minutes": 60,
    "exclude_projects": []
  },
  "redaction": {
    "real_username": "YOUR_USERNAME",
    "replacement_username": "USER",
    "extra_patterns": [
      "(?i)(api[_-]?key|secret|token|bearer|password)...",
      "sk-[A-Za-z0-9]{20,}"
    ]
  },
  "truncation": {
    "tool_result_chars": 500,
    "bash_stdout_lines": 5
  },
  "adapters": {
    "obsidian": {
      "vault_paths": ["~/Documents/Obsidian Vault"]
    }
  }
}

All paths, regexes, truncation limits, and per-adapter settings are tunable. See docs/configuration.md.
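
To see what those redaction settings do in practice, here is an illustrative application of them; llmwiki's converter uses its own replacement tokens, so the "[REDACTED]" marker below is just for the demo:

import json, re

cfg = json.load(open("config.json"))["redaction"]

def redact(text):
    text = text.replace(cfg["real_username"], cfg["replacement_username"])
    for pattern in cfg["extra_patterns"]:
        text = re.sub(pattern, "[REDACTED]", text)   # marker token is an assumption
    return text

print(redact("api_key sk-" + "x" * 24 + " committed by YOUR_USERNAME"))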

.llmwikiignore

Gitignore-style pattern file at the repo root. Skip entire projects, dates, or specific sessions without touching config:

# Skip a whole project
confidential-client/
# Skip anything before a date
*2025-*
# Keep exception
!confidential-client/public-*
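
The semantics are the usual gitignore ones - later patterns win and a leading ! re-includes. A stdlib approximation (illustration only, not llmwiki's actual matcher):

import fnmatch

def is_ignored(rel_path, patterns):
    ignored = False
    for raw in patterns:
        pattern = raw.strip()
        if not pattern or pattern.startswith("#"):
            continue                          # skip blanks and comments
        negated = pattern.startswith("!")
        if negated:
            pattern = pattern[1:]
        if pattern.endswith("/"):
            pattern += "*"                    # a directory pattern covers everything under it
        if fnmatch.fnmatch(rel_path, pattern):
            ignored = not negated             # last matching pattern wins
    return ignored

rules = ["confidential-client/", "*2025-*", "!confidential-client/public-*"]
print(is_ignored("confidential-client/2026-01-10-internal.md", rules))   # True  (skipped)
print(is_ignored("confidential-client/public-roadmap.md", rules))        # False (kept)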

Karpathy's LLM Wiki pattern

This project follows the three-layer structure described in Karpathy's gist:

  1. Raw sources (raw/) - immutable. Session transcripts converted from .jsonl.
  2. The wiki (wiki/) - LLM-generated. One page per entity, concept, source. Interlinked via [[wikilinks]].
  3. The schema (CLAUDE.md, AGENTS.md) - tells your agent how to ingest and query.

See docs/architecture.md for the full breakdown and how it maps to the file tree.

Design principles

  • Stdlib first - the only mandatory runtime dep is markdown. pypdf is an optional extra for PDF ingestion.
  • Works offline - no Google fonts, no external CSS. Syntax highlighting loads from a highlight.js CDN but degrades gracefully without it.
  • Redact by default - username, API keys, tokens, and emails all get redacted before entering the wiki.
  • Idempotent everything - re-running any command is safe and cheap.
  • Agent-agnostic core - the converter doesn't know which agent produced the .jsonl; adapters translate.
  • Privacy by default - localhost-only binding, no telemetry, no cloud calls.
  • Dual-format output (v0.4) - every page ships for both humans (HTML) and AI agents (TXT + JSON + JSON-LD + sitemap + llms.txt).

Docs

  • Getting started - 5-minute quickstart
  • Setup guide - 15-minute end-to-end tutorial: local setup → deploy to GitHub Pages → customization (v1.0)
  • Obsidian integration - 5-minute setup, 6 recommended plugins, config recipes (v1.0)
  • Architecture - Karpathy 3-layer + 8-layer build breakdown
  • Configuration - every tuning knob
  • Privacy - redaction rules + .llmwikiignore + localhost binding
  • Windows setup - Windows-specific gotchas
  • Framework - Open Source Framework v4.1 adapted for agent-native dev tools
  • Research - Phase 1.25 analysis of 15 prior LLM Wiki implementations
  • Feature matrix - all 161 features across 16 categories
  • Roadmap - Phase × Layer × Item MoSCoW table
  • Translations: i18n/zh-CN, i18n/ja, i18n/es

Per-adapter docs:

Releases

Version Focus Tag
v0.1.0 Core release - Claude Code adapter, god-level HTML UI, schema, CI, plugin scaffolding v0.1.0
v0.2.0 Extensions - 3 new slash commands, 3 new adapters, Obsidian bidirectional, full MCP server v0.2.0
v0.3.0 PyPI packaging, eval framework, i18n scaffold v0.3.0
v0.4.0 AI + human dual format - per-page .txt/.json siblings, llms.txt, JSON-LD graph, sitemap, RSS, schema.org microdata, reading time, related pages, activity heatmap, deep-link anchors, build manifest, link checker, wiki_export MCP tool v0.4.0
v0.5.0 – v0.9.0 Internal sprint milestones - features (_context.md, auto-ingest, qmd export, model-profile schema, activity heatmap, Copilot adapters, etc.) shipped consolidated under the v0.9.x line. No standalone tags were published. -
v0.9.1 Sprint 1 & 2 foundation - link-obsidian CLI, 4-factor confidence scoring, 5-state lifecycle machine, llmbook-reference skill, 7 entity types, flat raw/ naming, pending ingest queue, _context.md stubs, meeting + Jira adapters, configurable Web Clipper intake, rich log format v0.9.1
v0.9.2 Sprint 3 quality - 11 lint rules (8 basic + 3 LLM-powered), Auto Dream MEMORY.md consolidation, Dataview dashboard template, category pages (Dataview + static), auto-build on sync + configurable lint schedule v0.9.2
v0.9.3 Sprint 3 polish - Obsidian Templater templates, integration guide, two-way editing tests, MCP server 7→12 tools, adapter config validation, pipeline fix (sigstore, PyPI gate) v0.9.3
v0.9.4 Session C1 (Sprint 4) - multi-agent skill installer, enhanced search with facets, configurable scheduled sync (launchd/systemd/Task Scheduler), CI wiki-checks workflow v0.9.4
v0.9.5 Docs polish + consistency audit before v1.0.0 v0.9.5
v1.0.0 Production-ready Obsidian integration - full v1.0 scope v1.0.0
v1.1.0-rc1 Solo quick-win sprint - candidates workflow, Ollama scaffold, prompt-cache scaffold v1.1.0-rc1
v1.1.0-rc2 Session E - interactive graph viewer + remaining code-only v1.1 work v1.1.0-rc2
v1.1.0-rc3 Gap-sweep bundle - state portability, quarantine, sync --status, log CLI, synthesize --estimate breakdown, tag family, stale references, graph context menu, raw immutability, AI-sessions default v1.1.0-rc3
v1.1.0-rc4 Navigation + quality - graph site_url resolver (99.7% → 0% dead clicks), llmwiki backlinks CLI (95% → 0% orphan pages), source-code → GitHub link rewriter (471 → 100 broken), verify-before-fixing contribution rule v1.1.0-rc4
v1.1.0-rc5 Site audit + 5 closed batches - session-local ref stripping (351 → 247 broken), cheatsheet, README/CONTRIBUTING compile, expanded E2E, slash-CLI parity test, 4 adapter docs, Ollama tutorial, dual-mode docs skeleton, /wiki-synthesize slash v1.1.0-rc5
v1.1.0-rc6 rc6 batch - fixed adapter tag hardcoded to claude-code for every adapter (#346), tutorial UX polish with in-page TOC + prev/next + edit-on-GitHub (#282), command palette now indexes 107 doc pages + 17 slash commands (#277), content-hash cache for md_to_html (#283) v1.1.0-rc6
v1.1.0-rc7 rc7 batch - automatic AI-suggested tags during synthesis (#351), link-checker config fix (#348, #350, #353) v1.1.0-rc7
v1.1.0-rc8 rc8 batch - complete Mode B agent-delegate backend (#316): new llmwiki synthesize --list-pending + --complete <uuid> CLI subcommands, /wiki-sync step 6 auto-detects pending prompts, Mode B ships end-to-end without an API key v1.1.0-rc8
v1.2.0 First stable on the 1.x line - llmwiki all one-shot pipeline runner, Playwright + axe-core E2E suite (#384), project-stub auto-seeding, 2 new lint rules, critical export-fidelity + sync-collision fixes, 10 UX-critique items (#387). PyPI distribution name: llm-notebook. v1.2.0

Roadmap

Shipped milestones:

  • v0.5.0 - Folder-level _context.md, auto-ingest, adapter graduations, lazy search index, scheduled sync, WCAG, E2E tests (milestone)
  • v0.6.0 - qmd export, GitLab Pages CI, PyPI release automation, maintainer governance scaffold (milestone)
  • v0.7.0 - Structured model-profile schema, vs-comparison pages, append-only changelog timeline (milestone)
  • v0.8.0 - 365-day activity heatmap, tool-calling bar chart, token usage card, session metrics frontmatter (milestone)
  • v0.9.0 - Project topics, agent labels, Copilot adapters, image pipeline, highlight.js, public demo deployment
  • v0.9.x - Sprint 1-4 foundation for v1.0.0 Obsidian integration: confidence scoring, lifecycle state machine, 9 navigation files, 11 lint rules, Auto Dream, Dataview dashboard, multi-agent skills, 12-tool MCP server, meeting + Jira adapters

Active milestones:

Milestone Focus Tracking
v1.0.0 Final docs polish + PyPI trusted publisher + release Milestone
v1.1.0 Ollama backend, prompt caching, interactive graph viewer, Homebrew tap Milestone
v1.2.0 ChatGPT + OpenCode adapters, vault-overlay mode, tree-aware search, cache tiers Milestone

Deployment targets

Acknowledgements

License

MIT © Pratiyush

Project details



Download files


Source Distribution

llm_notebook-1.3.24.tar.gz (519.3 kB)

Uploaded Source

Built Distribution


llm_notebook-1.3.24-py3-none-any.whl (308.4 kB)

Uploaded Python 3

File details

Details for the file llm_notebook-1.3.24.tar.gz.

File metadata

  • Download URL: llm_notebook-1.3.24.tar.gz
  • Size: 519.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for llm_notebook-1.3.24.tar.gz
Algorithm Hash digest
SHA256 9b93812607d8456a752d66b7ec1f02b5096b70fa7022aa318952f479f0bab927
MD5 2c2d853931129100ea74779824dfc926
BLAKE2b-256 419abaf8144660fa92ed43433152e6abd15dd854ee565c12626cc47ec632f6bd


Provenance

The following attestation bundles were made for llm_notebook-1.3.24.tar.gz:

Publisher: release.yml on Pratiyush/llm-wiki


File details

Details for the file llm_notebook-1.3.24-py3-none-any.whl.

File metadata

  • Download URL: llm_notebook-1.3.24-py3-none-any.whl
  • Size: 308.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for llm_notebook-1.3.24-py3-none-any.whl
Algorithm Hash digest
SHA256 03ea11bfe14340d1b2877acba59ee7ae5e42b3d40efed3be3d3faec8ed7da09c
MD5 72a572cba0c71a3ec4bbea7894fbb344
BLAKE2b-256 d6bdf243f50e5648d64edd9e8d693c6401f39bcd21a8127a085b51d522fd4392


Provenance

The following attestation bundles were made for llm_notebook-1.3.24-py3-none-any.whl:

Publisher: release.yml on Pratiyush/llm-wiki

