
Alexandria

Local-first single-user knowledge engine. Accumulates your gathered knowledge (raw sources, compiled wiki pages, event streams, AI conversations) and exposes it via MCP to connected agents like Claude Code for retroactive query, retrieval, and synthesis.

Alexandria is not a chat client. Interactive conversations happen in your existing MCP-capable agent (Claude Code, Cursor, Codex). Alexandria is the knowledge engine those agents connect to.

Install

pip install alexandria-wiki          # core
pip install "alexandria-wiki[pdf]"   # + PDF support
pip install "alexandria-wiki[all]"   # + PDF + YouTube transcripts

Or with uv:

uvx alexandria-wiki
# or
uv tool install alexandria-wiki

Quick Start

# Initialize
alxia init
alxia status

# Create a project workspace
alxia project create my-research --description "ML papers"
alxia workspace use my-research

# Ingest from anywhere
alxia ingest ~/Documents/paper.pdf
alxia ingest https://arxiv.org/pdf/2401.12345.pdf
alxia ingest https://en.wikipedia.org/wiki/Transformer_(deep_learning_architecture)

# Query your knowledge
alxia query "attention mechanisms"

# Connect to Claude Code
alxia mcp install claude-code

Sources

Alexandria ingests from 14 source types:

Source            Command
Local file        alxia ingest ~/file.md
PDF               alxia ingest ~/paper.pdf
URL (HTML)        alxia ingest https://example.com/article
URL (PDF)         alxia ingest https://arxiv.org/pdf/2401.12345.pdf
Git repo          alxia source add git-local --name repo --repo-url ~/project
GitHub            alxia source add github --name repo --owner x --repo y
RSS/Atom          alxia source add rss --name blog --feed-url https://example.com/feed
YouTube           alxia source add youtube --name talks --urls "https://youtu.be/abc"
Notion            alxia source add notion --name wiki --token-ref key --page-ids "abc"
HuggingFace       alxia source add huggingface --name models --repos "meta-llama/Llama-3-8b"
Obsidian/Folder   alxia source add folder --name vault --path ~/ObsidianVault
Zip/Tar           alxia source add archive --name papers --path ~/papers.zip
IMAP              alxia source add imap --name mail --imap-host imap.gmail.com --imap-user me
Clipboard         alxia paste --title "note" --content "quick thought"

After adding sources, pull everything with:

alxia sync

MCP Integration

Alexandria exposes your knowledge to AI agents via the Model Context Protocol.

# Register with Claude Code (stdio transport)
alxia mcp install claude-code

# Or run the HTTP server for other clients
alxia mcp serve-http --port 7219

Available MCP tools: guide, overview, list, grep, search, read, follow, history, why, events, timeline, git_log, git_show, git_blame, sources, subscriptions.
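
MCP is JSON-RPC 2.0 under the hood, so a client invoking one of the tools above over the HTTP transport sends a `tools/call` request. A minimal sketch of that request shape — the `search` tool name comes from the list above, but the `query` argument and request id are illustrative, not Alexandria's documented parameters:

```python
import json

# Shape of a JSON-RPC 2.0 "tools/call" request, per the MCP spec.
# The argument name "query" is an assumption for illustration.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search",
        "arguments": {"query": "attention mechanisms"},
    },
}
payload = json.dumps(request)
print(payload)
```

Clients registered via `alxia mcp install claude-code` construct these requests for you; the sketch only shows what crosses the wire.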

CLI Reference

alxia init                    Initialize ~/.alexandria/
alxia status                  Operational dashboard
alxia doctor                  Health checks

alxia ingest <file-or-url>    Ingest a source (file, PDF, URL)
alxia query <question>        Search across all knowledge
alxia why <topic>             Belief explainability + provenance
alxia lint                    Find wiki rot (stale citations)
alxia synthesize              Generate temporal digest

alxia source add <type>       Add a source adapter
alxia source list             List configured sources
alxia sync                    Pull from all sources
alxia subscriptions poll      Poll RSS + IMAP
alxia subscriptions list      Show pending items

alxia workspace use <slug>    Switch workspace
alxia project create <name>   Create a project workspace

alxia secrets set <ref>       Store an encrypted secret
alxia hooks install <client>  Install capture hooks
alxia capture conversation    Capture an agent session

alxia eval run                Run quality metrics (M1-M5)
alxia daemon start            Start background scheduler
alxia logs show               View structured logs
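
The logs that `alxia logs show` surfaces are structured JSONL with run_id correlation (see Architecture below). A hedged sketch of that format — one JSON object per line, all lines of a run sharing a run_id; the field names here are assumptions, not Alexandria's actual log schema:

```python
import io
import json
import uuid

# Every event in a run carries the same run_id, so a run's lines can be
# filtered and correlated after the fact.
run_id = str(uuid.uuid4())
buf = io.StringIO()
for event in ("sync.start", "sync.fetch", "sync.done"):
    buf.write(json.dumps({"run_id": run_id, "event": event}) + "\n")

lines = [json.loads(line) for line in buf.getvalue().splitlines()]
assert all(rec["run_id"] == run_id for rec in lines)
```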

Architecture

  • SQLite + filesystem hybrid: filesystem is source of truth for documents, SQLite for search/metadata/events
  • FTS5 for keyword search (no vectors, no RAG — the agent IS the retriever)
  • Hostile verifier: every wiki write is verified before commit (citations, quote anchors, cascade policy)
  • Belief revision: structured claims with supersession chains and provenance
  • AES-256-GCM vault: encrypted secrets with PBKDF2 key derivation
  • Structured JSONL logging with run_id correlation
  • 8 schema migrations applied automatically on init
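
The FTS5 point above is the core retrieval design: plain keyword search, with the connected agent doing the reasoning. A minimal sketch of that style of query — the table and column names are illustrative, not Alexandria's actual schema:

```python
import sqlite3

# FTS5 virtual table: SQLite's built-in full-text index, no vectors involved.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE pages USING fts5(title, body)")
conn.executemany(
    "INSERT INTO pages (title, body) VALUES (?, ?)",
    [
        ("Transformers", "Attention mechanisms weigh token relevance."),
        ("CNNs", "Convolutions slide filters over local windows."),
    ],
)
# MATCH runs the keyword query; bm25() ranks results by relevance.
rows = conn.execute(
    "SELECT title FROM pages WHERE pages MATCH ? ORDER BY bm25(pages)",
    ("attention",),
).fetchall()
print(rows)  # [('Transformers',)]
```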

See docs/architecture/ for the 20 architecture documents.
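
For the vault, the PBKDF2 step mentioned above turns a passphrase into a 256-bit key suitable for AES-256-GCM. A sketch of just that derivation step, using the standard library — the iteration count and salt size are illustrative, and the actual AES-GCM encryption (which needs a third-party library such as `cryptography`) is omitted:

```python
import hashlib
import os


def derive_key(passphrase: str, salt: bytes, iterations: int = 600_000) -> bytes:
    """PBKDF2-HMAC-SHA256: stretch a passphrase into a 32-byte AES-256 key."""
    return hashlib.pbkdf2_hmac(
        "sha256", passphrase.encode(), salt, iterations, dklen=32
    )


salt = os.urandom(16)          # stored alongside the ciphertext, not secret
key = derive_key("correct horse battery staple", salt)
assert len(key) == 32          # 256 bits
```

The same passphrase and salt always yield the same key, which is what lets the vault decrypt later without storing the key itself.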

Docker

docker build -t alexandria .
docker run -v ~/.alexandria:/data alexandria init
docker run -v ~/.alexandria:/data alexandria status

Development

git clone git@github.com:epappas/alexandria.git
cd alexandria
uv sync --dev
uv run pytest tests/       # 352 tests
./scripts/build.sh          # test + build
./scripts/publish.sh        # test + build + PyPI + git tag

License

MIT
