WikiNow
Know it now. Keep it forever.
A local-first personal knowledge base: you feed in sources, an LLM compiles them into a structured wiki, and you query it through any AI tool via MCP.
Inspired by Andrej Karpathy's LLM Wiki pattern.
You collect stuff → LLM organizes it → You query it → Knowledge compounds
How It Works
Instead of RAG (which re-derives knowledge on every query), the LLM incrementally builds and maintains a persistent wiki. When you add a source, it reads it, extracts key information, and integrates it into the existing wiki — updating entity pages, revising topic summaries, noting contradictions, and strengthening the evolving synthesis.
The wiki is a persistent, compounding artifact. A single ingest can touch 10-15 wiki pages. You never write the wiki yourself — the LLM does all the bookkeeping.
┌────────────────────────┬────────────────────────┐
│ You (human)            │ LLM (via MCP)          │
├────────────────────────┼────────────────────────┤
│ Curate sources         │ Summarize              │
│ Ask good questions     │ Cross-reference        │
│ Think about meaning    │ Flag contradictions    │
│                        │ Maintain consistency   │
│                        │ Update 15 files/ingest │
└────────────────────────┴────────────────────────┘
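The ingest ripple described above can be sketched in a few lines. This is a toy illustration only — the page names mirror the layout documented later in this README, but the actual update logic is performed by the LLM, not by code like this:

```python
import tempfile
from datetime import date
from pathlib import Path

def ingest(wiki: Path, source_name: str, summary: str) -> list[Path]:
    """Toy ingest ripple: one source touches several wiki pages."""
    touched = []

    # 1. A new page under sources/ holds the summary of the raw source.
    page = wiki / "sources" / f"{source_name}.md"
    page.parent.mkdir(parents=True, exist_ok=True)
    page.write_text(f"# {source_name}\n\n{summary}\n")
    touched.append(page)

    # 2. The master catalog gains a line pointing at the new page.
    index = wiki / "index.md"
    with index.open("a") as f:
        f.write(f"- [[sources/{source_name}]]\n")
    touched.append(index)

    # 3. The append-only log records the event.
    log = wiki / "log.md"
    with log.open("a") as f:
        f.write(f"- {date.today()}: ingested {source_name}\n")
    touched.append(log)

    return touched

wiki = Path(tempfile.mkdtemp()) / "wiki"
wiki.mkdir()
(wiki / "index.md").write_text("# Index\n")
(wiki / "log.md").write_text("# Log\n")
pages = ingest(wiki, "karpathy-llm-wiki", "Notes on the LLM Wiki pattern.")
print(len(pages))
```

A real ingest also revises overview.md, tags.md, and any concept pages the source touches, which is where the 10-15 files per pass come from.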
Features
- One-command capture — wn ingest <url> fetches any URL, YouTube video, PDF, epub, or audio file
- Ripple effect — one source touches 10-15 wiki pages (source summary + concepts + index + overview + tags + log)
- AI-native query — host AI reads index.md, finds relevant pages, synthesizes answers with citations
- Compounding queries — valuable answers filed back into the wiki as new pages
- Self-healing database — SQLite FTS5 cache auto-syncs with .md files, no manual rebuild
- Obsidian-compatible — wikilinks, frontmatter, graph view, backlinks work out of the box
- Schema co-evolution — CLAUDE.md instructions evolve through conversation
- Multi-project — separate knowledge bases for different topics
- 21 MCP tools — full toolkit for the host AI to manage the wiki
- Professional CLI — rich terminal UI with panels, colors, health bars
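The self-healing cache idea can be sketched with stdlib sqlite3. Here an in-memory dict stands in for the .md files on disk, and a drop-and-reinsert sync keeps the FTS5 table mirroring them (a sketch of the concept, not WikiNow's actual sync strategy):

```python
import sqlite3

# Stand-in for the .md files currently on disk.
articles = {
    "overview": "WikiNow compiles sources into a persistent wiki.",
    "gaps": "Open question: how do queries compound over time?",
}

db = sqlite3.connect(":memory:")
db.execute("CREATE VIRTUAL TABLE wiki USING fts5(name, body)")

def sync(db, articles):
    """Rebuild the cache from the files, so it can never drift:
    there is no separate 'rebuild the index' step to forget."""
    db.execute("DELETE FROM wiki")
    db.executemany("INSERT INTO wiki VALUES (?, ?)", articles.items())

sync(db, articles)
hits = db.execute("SELECT name FROM wiki WHERE wiki MATCH 'compiles'").fetchall()
print(hits)
```

Because the markdown files are the source of truth and the database is only a derived cache, editing a page in Obsidian cannot corrupt anything — the cache simply catches up on the next sync.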
Installation
# pip
pip install wikinow
# uv
uv tool install wikinow
Optional Dependencies
# Install with specific extras
pip install wikinow[ollama] # Ollama web search
pip install wikinow[pdf] # PDF extraction
pip install wikinow[youtube] # YouTube transcripts
pip install wikinow[epub] # Epub parsing
pip install wikinow[whisper] # Audio transcription (Whisper)
pip install wikinow[watch] # Auto-ingest on file drop (watchdog)
# Install everything
pip install wikinow[all]
# or
uv tool install wikinow[all]
Quick Start
# Create a project
wn init my-research
# Start the MCP server
wn serve
# Connect to your AI tool (pick one)
claude mcp add wikinow -- wn serve # Claude Code
codex mcp add wikinow -- wn serve # Codex
For Cursor / VS Code, add to .vscode/mcp.json:
{
"mcpServers": {
"wikinow": {
"command": "wn",
"args": ["serve"]
}
}
}
Then tell the AI: "Ingest this URL and compile it into the wiki" — and watch it work.
CLI Commands
wn init <name> Create a new project
wn use <name> Switch active project
wn list List all projects
wn serve Start MCP server
wn ingest <url|file> Ingest a URL or local file
wn search "query" Search the wiki (FTS5)
wn read <article> Read a wiki article
wn stats Project statistics
wn lint Health check
wn gaps Knowledge gaps
wn config Show configuration
wn config <key> <value> Update configuration
wn export Export as single markdown file
wn --version Show version
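Of these, wn lint is the least obvious. A toy version of one of its checks — finding [[wikilinks]] that point at pages that don't exist — might look like this (hypothetical logic; the real command also surfaces contradictions and gaps):

```python
import re

# Minimal lint pass: every [[wikilink]] should resolve to an existing page.
pages = {
    "overview": "See [[sources/paper-one]] and [[concepts/compounding]].",
    "sources/paper-one": "Summary of paper one.",
}

# Capture the link target, stopping at ']', '|' (alias), or '#' (heading).
WIKILINK = re.compile(r"\[\[([^\]|#]+)")

def broken_links(pages: dict[str, str]) -> list[tuple[str, str]]:
    """Return (page, target) pairs where the target has no page of its own."""
    return [
        (name, target)
        for name, body in pages.items()
        for target in WIKILINK.findall(body)
        if target not in pages
    ]

print(broken_links(pages))  # [('overview', 'concepts/compounding')]
```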
MCP Tools
WikiNow exposes 21 tools to the host AI:
| Category | Tools |
|---|---|
| Ingest | ingest_url, ingest_text, ingest_file |
| Read/Write | read, write, index_article, index_raw, mark_compiled |
| Search | search, search_web |
| List/Stats | list_all_articles, list_all_raw, list_all_tags, get_project_stats, get_all_contradictions, get_gaps |
| Maintenance | lint, append_log, update_schema, re_ingest, export |
Source Types
| Source | How it works |
|---|---|
| Web URLs | Jina Reader — free, no API key, handles JavaScript |
| YouTube | yt-dlp subtitles, Whisper fallback for audio |
| PDFs (web) | Jina Reader handles PDF URLs natively |
| PDFs (local) | pymupdf extraction |
| Epub books | ebooklib + BeautifulSoup |
| Audio/video | Whisper turbo model (local, free) |
| Text/Markdown | Direct read |
Note: WikiNow only supports English content. Non-English audio is automatically detected and skipped. YouTube subtitles are fetched in English only.
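The routing in the table above amounts to a dispatch on URL host and file extension. A hypothetical sketch (extractor names taken from the table; the actual dispatch logic is internal to WikiNow):

```python
from pathlib import PurePosixPath
from urllib.parse import urlparse

# Local-file extractors, keyed by extension (mirrors the table above).
EXTRACTORS = {
    ".pdf": "pymupdf",
    ".epub": "ebooklib",
    ".mp3": "whisper", ".mp4": "whisper", ".wav": "whisper",
    ".md": "direct", ".txt": "direct",
}

def pick_extractor(source: str) -> str:
    """Route a URL or local path to an extractor name (sketch only)."""
    url = urlparse(source)
    if url.scheme in ("http", "https"):
        host = url.netloc.lower()
        if "youtube.com" in host or "youtu.be" in host:
            return "yt-dlp"
        return "jina-reader"  # also covers PDF URLs
    suffix = PurePosixPath(source).suffix.lower()
    return EXTRACTORS.get(suffix, "direct")

print(pick_extractor("https://youtu.be/abc"))  # yt-dlp
print(pick_extractor("notes/book.epub"))       # ebooklib
```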
Project Structure
Each project lives in ~/.wikinow/<name>/:
~/.wikinow/my-research/
├── raw/ ← immutable sources (never modified)
├── wiki/
│ ├── index.md ← master catalog
│ ├── overview.md ← evolving synthesis
│ ├── log.md ← append-only history
│ ├── contradictions.md ← conflict tracker
│ ├── gaps.md ← open questions
│ ├── tags.md ← tag index
│ ├── sources/ ← one page per raw source
│ ├── concepts/ ← concept and entity pages
│ ├── comparisons/ ← X vs Y analysis
│ └── queries/ ← filed query answers
├── images/ ← downloaded images
├── CLAUDE.md ← schema (Claude Code + Cursor)
├── AGENTS.md ← symlink (Codex + Copilot)
├── .obsidian/ ← pre-configured vault
└── wikinow.db ← self-healing FTS5 index
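The directory skeleton above can be reproduced with pathlib. This is illustrative only — wn init also writes the Obsidian config, the schema files, and the database:

```python
import tempfile
from pathlib import Path

def scaffold(root: Path) -> None:
    """Create the directory skeleton shown above (sketch only)."""
    for d in ("raw", "images", "wiki/sources", "wiki/concepts",
              "wiki/comparisons", "wiki/queries"):
        (root / d).mkdir(parents=True)
    for page in ("index", "overview", "log", "contradictions", "gaps", "tags"):
        (root / "wiki" / f"{page}.md").write_text(f"# {page}\n")

project = Path(tempfile.mkdtemp()) / "my-research"
scaffold(project)
print(sorted(p.name for p in (project / "wiki").glob("*.md")))
```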
Configuration
# ~/.wikinow/config.yaml
projects:
active: my-research
ollama:
api_key: "" # OLLAMA_API_KEY env var — for web search
whisper:
model: turbo # Whisper model for audio transcription
ingestion:
jina_api_key: "" # Optional — 20 RPM free, 500 RPM with key
auto_compile: true
auto_watch: false
search:
max_results: 10
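The comment on api_key above suggests the OLLAMA_API_KEY environment variable can stand in for the file value. A sketch of that precedence, assuming the environment wins over config.yaml (the dict below stands in for the parsed YAML):

```python
import os

def resolve_ollama_key(config: dict) -> str:
    """Environment variable first, then the config file value (assumed order)."""
    return os.environ.get("OLLAMA_API_KEY") or config.get("ollama", {}).get("api_key", "")

config = {"ollama": {"api_key": "from-file"}}

os.environ.pop("OLLAMA_API_KEY", None)
print(resolve_ollama_key(config))  # from-file

os.environ["OLLAMA_API_KEY"] = "from-env"
print(resolve_ollama_key(config))  # from-env
```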
Obsidian Integration
wn init creates a pre-configured .obsidian/ vault:
- Wikilinks enabled — [[page]] links work natively
- Graph view — see how pages connect
- Backlinks — see what links to each page
- Cmd+Shift+D — download remote images locally
- Dataview compatible — YAML frontmatter on every article
Open ~/.wikinow/<project>/ in Obsidian and browse your wiki in real time.
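Dataview compatibility rests on the YAML frontmatter at the top of every article. A minimal parser for the flat key: value case shows the shape involved (real frontmatter may nest, which this sketch ignores):

```python
def read_frontmatter(text: str) -> dict[str, str]:
    """Parse flat 'key: value' YAML frontmatter delimited by --- lines."""
    lines = text.splitlines()
    if not lines or lines[0] != "---":
        return {}
    meta = {}
    for line in lines[1:]:
        if line == "---":
            break
        key, _, value = line.partition(":")
        meta[key.strip()] = value.strip()
    return meta

article = """---
title: LLM Wiki pattern
tags: knowledge-management
---
Body of the article.
"""
print(read_frontmatter(article))
```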
The Karpathy Pattern
WikiNow implements the LLM Wiki pattern by Andrej Karpathy:
"The tedious part of maintaining a knowledge base is not the reading or the thinking — it's the bookkeeping. LLMs don't get bored, don't forget to update a cross-reference, and can touch 15 files in one pass."
Three layers:
- Raw sources — immutable, curated by you
- The wiki — compiled and maintained by the LLM
- The schema — co-evolved by you and the LLM over time
Three operations:
- Ingest — add a source, wiki ripples with updates
- Query — ask questions, answers compound back into wiki
- Lint — health check, find contradictions, suggest gaps to fill
Requirements
- Python >= 3.11
- ffmpeg — required if using audio transcription or YouTube Whisper fallback
Installing ffmpeg
# macOS
brew install ffmpeg
# Ubuntu / Debian
sudo apt install ffmpeg
# Fedora
sudo dnf install ffmpeg
Testing
# Install dev dependencies
uv sync --group dev
# Run all tests
uv run pytest tests/ -v
# Run a single test file
uv run pytest tests/test_server.py -v
License
MIT
Acknowledgments
This project implements the LLM Wiki pattern by Andrej Karpathy. The core architecture — three layers (raw, wiki, schema), three operations (ingest, query, lint), and the philosophy that humans curate while LLMs maintain — comes directly from his work.