Alexandria

Local-first single-user knowledge engine. Accumulates your gathered knowledge (raw sources, compiled wiki pages, event streams, AI conversations) and exposes it via MCP to connected agents like Claude Code for retroactive query, retrieval, and synthesis.

Alexandria is not a chat client. Interactive conversations happen in your existing MCP-capable agent (Claude Code, Cursor, Codex). Alexandria is the knowledge engine those agents connect to.

Install

pip install alexandria-wiki          # core
pip install "alexandria-wiki[pdf]"   # + PDF support
pip install "alexandria-wiki[all]"   # + PDF + YouTube transcripts

Or with uv:

uvx alexandria-wiki
# or
uv tool install alexandria-wiki

Quick Start

# Initialize
alxia init
alxia status

# Create a project workspace
alxia project create my-research --description "ML papers"
alxia workspace use my-research

# Ingest from anywhere
alxia ingest ~/Documents/paper.pdf
alxia ingest https://arxiv.org/pdf/2401.12345.pdf
alxia ingest https://en.wikipedia.org/wiki/Transformer_(deep_learning_architecture)

# Query your knowledge
alxia query "attention mechanisms"

# Connect to Claude Code
alxia mcp install claude-code

Sources

Alexandria ingests from 14 source types:

Source           Command
Local file       alxia ingest ~/file.md
PDF              alxia ingest ~/paper.pdf
URL (HTML)       alxia ingest https://example.com/article
URL (PDF)        alxia ingest https://arxiv.org/pdf/2401.12345.pdf
Git repo         alxia source add git-local --name repo --repo-url ~/project
GitHub           alxia source add github --name repo --owner x --repo y
RSS/Atom         alxia source add rss --name blog --feed-url https://example.com/feed
YouTube          alxia source add youtube --name talks --urls "https://youtu.be/abc"
Notion           alxia source add notion --name wiki --token-ref key --page-ids "abc"
HuggingFace      alxia source add huggingface --name models --repos "meta-llama/Llama-3-8b"
Obsidian/Folder  alxia source add folder --name vault --path ~/ObsidianVault
Zip/Tar          alxia source add archive --name papers --path ~/papers.zip
IMAP             alxia source add imap --name mail --imap-host imap.gmail.com --imap-user me
Clipboard        alxia paste --title "note" --content "quick thought"

After adding sources, pull everything with:

alxia sync

MCP Integration

Alexandria exposes your knowledge to AI agents via the Model Context Protocol.

# Register with Claude Code (stdio transport)
alxia mcp install claude-code

# Or run the HTTP server for other clients
alxia mcp serve-http --port 7219

Available MCP tools: guide, overview, list, grep, search, read, follow, history, why, events, timeline, git_log, git_show, git_blame, sources, subscriptions.
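
For MCP clients that Alexandria cannot register automatically, a manual stdio registration typically looks like the sketch below. The config-file shape follows the common `mcpServers` convention used by MCP clients; the `["mcp", "serve"]` stdio subcommand is an assumption (this README only documents `install` and `serve-http`), so check `alxia mcp --help` for the actual invocation:

```json
{
  "mcpServers": {
    "alexandria": {
      "command": "alxia",
      "args": ["mcp", "serve"]
    }
  }
}
```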

CLI Reference

alxia init                    Initialize ~/.alexandria/
alxia status                  Operational dashboard
alxia doctor                  Health checks

alxia ingest <file-or-url>    Ingest a source (file, PDF, URL)
alxia query <question>        Search across all knowledge
alxia why <topic>             Belief explainability + provenance
alxia lint                    Find wiki rot (stale citations)
alxia synthesize              Generate temporal digest

alxia source add <type>       Add a source adapter
alxia source list             List configured sources
alxia sync                    Pull from all sources
alxia subscriptions poll      Poll RSS + IMAP
alxia subscriptions list      Show pending items

alxia workspace use <slug>    Switch workspace
alxia project create <name>   Create a project workspace

alxia secrets set <ref>       Store an encrypted secret
alxia hooks install <client>  Install capture hooks
alxia capture conversation    Capture an agent session

alxia eval run                Run quality metrics (M1-M5)
alxia daemon start            Start background scheduler
alxia logs show               View structured logs

Architecture

  • SQLite + filesystem hybrid: filesystem is source of truth for documents, SQLite for search/metadata/events
  • FTS5 for keyword search (no vectors, no RAG — the agent IS the retriever)
  • Hostile verifier: every wiki write is verified before commit (citations, quote anchors, cascade policy)
  • Belief revision: structured claims with supersession chains and provenance
  • AES-256-GCM vault: encrypted secrets with PBKDF2 key derivation
  • Structured JSONL logging with run_id correlation
  • 8 schema migrations applied automatically on init
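
The FTS5 design above can be sketched with Python's built-in sqlite3 module. Table and column names here are illustrative, not Alexandria's actual schema; the point is that keyword search and ranking come from SQLite itself, with no embedding or vector index:

```python
import sqlite3

# In-memory sketch of an FTS5 keyword index over wiki pages.
con = sqlite3.connect(":memory:")
con.execute("CREATE VIRTUAL TABLE pages USING fts5(title, body)")
con.executemany(
    "INSERT INTO pages (title, body) VALUES (?, ?)",
    [
        ("Transformers", "Attention mechanisms weigh token relevance."),
        ("CNNs", "Convolutions apply sliding filters to inputs."),
    ],
)
# MATCH performs tokenized keyword search; bm25() ranks the hits.
rows = con.execute(
    "SELECT title FROM pages WHERE pages MATCH ? ORDER BY bm25(pages)",
    ("attention",),
).fetchall()
print(rows)  # → [('Transformers',)]
```

Because the agent drives retrieval (issuing grep/search/read calls over MCP), a fast exact-match index like this is all the engine needs to provide.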

See docs/architecture/ for the 20 architecture documents.
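
The vault's key-derivation step can be illustrated with the standard library alone. This is a minimal sketch, assuming a 16-byte salt and an illustrative iteration count; Alexandria's actual parameters may differ, and the derived key would then feed an AES-256-GCM cipher:

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes, iterations: int = 600_000) -> bytes:
    # PBKDF2-HMAC-SHA256 stretches a passphrase into a 32-byte (256-bit) key.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations, dklen=32)

salt = os.urandom(16)
key = derive_key("correct horse battery staple", salt)
assert len(key) == 32  # exactly the size AES-256 requires
# Derivation is deterministic for the same passphrase and salt:
assert key == derive_key("correct horse battery staple", salt)
```

Storing the salt alongside the ciphertext (it is not secret) lets the vault re-derive the same key at unlock time.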

Docker

docker build -t alexandria .
docker run -v ~/.alexandria:/data alexandria init
docker run -v ~/.alexandria:/data alexandria status

Development

git clone git@github.com:epappas/alexandria.git
cd alexandria
uv sync --dev
uv run pytest tests/       # 352 tests
./scripts/build.sh          # test + build
./scripts/publish.sh        # test + build + PyPI + git tag

License

MIT
