Alexandria

Local-first single-user knowledge engine. Accumulates your gathered knowledge (raw sources, compiled wiki pages, event streams, AI conversations) and exposes it via MCP to connected agents like Claude Code for retroactive query, retrieval, and synthesis.

Alexandria is not a chat client. Interactive conversations happen in your existing MCP-capable agent (Claude Code, Cursor, Codex). Alexandria is the knowledge engine those agents connect to.

Install

pip install alexandria-wiki          # core
pip install "alexandria-wiki[pdf]"   # + PDF support
pip install "alexandria-wiki[all]"   # + PDF + YouTube transcripts

Or with uv:

uvx alexandria-wiki
# or
uv tool install alexandria-wiki

Quick Start

# Initialize
alxia init
alxia status

# Create a project workspace
alxia project create my-research --description "ML papers"
alxia workspace use my-research

# Ingest from anywhere
alxia ingest ~/Documents/paper.pdf
alxia ingest https://arxiv.org/pdf/2401.12345.pdf
alxia ingest https://en.wikipedia.org/wiki/Transformer_(deep_learning_architecture)

# Query your knowledge
alxia query "attention mechanisms"

# Connect to Claude Code
alxia mcp install claude-code

Sources

Alexandria ingests from 14 source types:

Source           Command
Local file       alxia ingest ~/file.md
PDF              alxia ingest ~/paper.pdf
URL (HTML)       alxia ingest https://example.com/article
URL (PDF)        alxia ingest https://arxiv.org/pdf/2401.12345.pdf
Git repo         alxia source add git-local --name repo --repo-url ~/project
GitHub           alxia source add github --name repo --owner x --repo y
RSS/Atom         alxia source add rss --name blog --feed-url https://example.com/feed
YouTube          alxia source add youtube --name talks --urls "https://youtu.be/abc"
Notion           alxia source add notion --name wiki --token-ref key --page-ids "abc"
HuggingFace      alxia source add huggingface --name models --repos "meta-llama/Llama-3-8b"
Obsidian/Folder  alxia source add folder --name vault --path ~/ObsidianVault
Zip/Tar          alxia source add archive --name papers --path ~/papers.zip
IMAP             alxia source add imap --name mail --imap-host imap.gmail.com --imap-user me
Clipboard        alxia paste --title "note" --content "quick thought"

After adding sources, pull everything with:

alxia sync

MCP Integration

Alexandria exposes your knowledge to AI agents via the Model Context Protocol.

# Register with Claude Code (stdio transport)
alxia mcp install claude-code

# Or run the HTTP server for other clients
alxia mcp serve-http --port 7219

Available MCP tools: guide, overview, list, grep, search, read, follow, history, why, events, timeline, git_log, git_show, git_blame, sources, subscriptions.
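Under the hood, MCP clients invoke these tools with JSON-RPC 2.0 requests. A minimal sketch of the request a client would send to the search tool; the "tools/call" method is standard MCP, while the "query" argument key is an assumption about Alexandria's tool schema:

```python
import json

# JSON-RPC 2.0 envelope for an MCP tool invocation.
# "search" is one of the tools listed above; the argument
# shape ({"query": ...}) is illustrative, not confirmed.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "search", "arguments": {"query": "attention mechanisms"}},
}
payload = json.dumps(request)
print(payload)
```

A client would POST this payload to the HTTP endpoint started by alxia mcp serve-http, or write it over stdio for the claude-code transport.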

CLI Reference

alxia init                    Initialize ~/.alexandria/
alxia status                  Operational dashboard
alxia doctor                  Health checks

alxia ingest <file-or-url>    Ingest a source (file, PDF, URL)
alxia query <question>        Search across all knowledge
alxia why <topic>             Belief explainability + provenance
alxia lint                    Find wiki rot (stale citations)
alxia synthesize              Generate temporal digest

alxia source add <type>       Add a source adapter
alxia source list             List configured sources
alxia sync                    Pull from all sources
alxia subscriptions poll      Poll RSS + IMAP
alxia subscriptions list      Show pending items

alxia workspace use <slug>    Switch workspace
alxia project create <name>   Create a project workspace

alxia secrets set <ref>       Store an encrypted secret
alxia hooks install <client>  Install capture hooks
alxia capture conversation    Capture an agent session

alxia eval run                Run quality metrics (M1-M5)
alxia daemon start            Start background scheduler
alxia logs show               View structured logs

Architecture

  • SQLite + filesystem hybrid: filesystem is source of truth for documents, SQLite for search/metadata/events
  • FTS5 for keyword search (no vectors, no RAG — the agent IS the retriever)
  • Hostile verifier: every wiki write is verified before commit (citations, quote anchors, cascade policy)
  • Belief revision: structured claims with supersession chains and provenance
  • AES-256-GCM vault: encrypted secrets with PBKDF2 key derivation
  • Structured JSONL logging with run_id correlation
  • 8 schema migrations applied automatically on init
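The FTS5 point above can be sketched with Python's stdlib sqlite3 module. The schema and column names here are illustrative assumptions, not Alexandria's actual tables; the query mirrors what alxia query "attention mechanisms" does conceptually:

```python
import sqlite3

# In-memory FTS5 index standing in for the real search/metadata store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE docs USING fts5(title, body)")
conn.executemany(
    "INSERT INTO docs (title, body) VALUES (?, ?)",
    [
        ("Attention Is All You Need", "transformer attention mechanisms"),
        ("ResNet", "deep residual learning for image recognition"),
    ],
)
# BM25-ranked keyword match; a multi-word query is an implicit AND.
rows = conn.execute(
    "SELECT title FROM docs WHERE docs MATCH ? ORDER BY rank",
    ("attention mechanisms",),
).fetchall()
print(rows[0][0])  # prints "Attention Is All You Need"
```

Because ranking is plain BM25 over keywords, there is no embedding index to maintain; the connected agent does the semantic reasoning over the returned hits.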

See docs/architecture/ for the 20 architecture documents.
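The vault's key-derivation step can be sketched with the stdlib. PBKDF2-SHA256 producing a 32-byte AES-256 key matches the architecture note above; the iteration count, salt size, and passphrase here are assumptions, not Alexandria's actual parameters:

```python
import hashlib
import os

# Derive a 256-bit AES key from a passphrase (illustrative parameters).
salt = os.urandom(16)
key = hashlib.pbkdf2_hmac("sha256", b"master passphrase", salt, 200_000, dklen=32)
assert len(key) == 32  # AES-256 key length
```

The derived key would then feed an AES-256-GCM cipher; the same passphrase and stored salt reproduce the key deterministically on later runs.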

Docker

docker build -t alexandria .
docker run -v ~/.alexandria:/data alexandria init
docker run -v ~/.alexandria:/data alexandria status

Development

git clone git@github.com:epappas/alexandria.git
cd alexandria
uv sync --dev
uv run pytest tests/       # 352 tests
./scripts/build.sh          # test + build
./scripts/publish.sh        # test + build + PyPI + git tag

License

MIT
