
Alexandria

Local-first single-user knowledge engine. Accumulates your gathered knowledge (raw sources, compiled wiki pages, event streams, AI conversations) and exposes it via MCP to connected agents like Claude Code for retroactive query, retrieval, and synthesis.

Alexandria is not a chat client. Interactive conversations happen in your existing MCP-capable agent (Claude Code, Cursor, Codex). Alexandria is the knowledge engine those agents connect to.

Install

pip install alexandria-wiki          # core
pip install "alexandria-wiki[pdf]"   # + PDF support
pip install "alexandria-wiki[all]"   # + PDF + YouTube transcripts

Or with uv:

uvx alexandria-wiki
# or
uv tool install alexandria-wiki

Quick Start

# Initialize
alxia init
alxia status

# Create a project workspace
alxia project create my-research --description "ML papers"
alxia workspace use my-research

# Ingest from anywhere
alxia ingest ~/Documents/paper.pdf
alxia ingest https://arxiv.org/pdf/2401.12345.pdf
alxia ingest https://en.wikipedia.org/wiki/Transformer_(deep_learning_architecture)

# Query your knowledge
alxia query "attention mechanisms"

# Connect to Claude Code
alxia mcp install claude-code

Sources

Alexandria ingests from 14 source types:

Source            Command
Local file        alxia ingest ~/file.md
PDF               alxia ingest ~/paper.pdf
URL (HTML)        alxia ingest https://example.com/article
URL (PDF)         alxia ingest https://arxiv.org/pdf/2401.12345.pdf
Git repo          alxia source add git-local --name repo --repo-url ~/project
GitHub            alxia source add github --name repo --owner x --repo y
RSS/Atom          alxia source add rss --name blog --feed-url https://example.com/feed
YouTube           alxia source add youtube --name talks --urls "https://youtu.be/abc"
Notion            alxia source add notion --name wiki --token-ref key --page-ids "abc"
HuggingFace       alxia source add huggingface --name models --repos "meta-llama/Llama-3-8b"
Obsidian/Folder   alxia source add folder --name vault --path ~/ObsidianVault
Zip/Tar           alxia source add archive --name papers --path ~/papers.zip
IMAP              alxia source add imap --name mail --imap-host imap.gmail.com --imap-user me
Clipboard         alxia paste --title "note" --content "quick thought"

After adding sources, pull everything with:

alxia sync

MCP Integration

Alexandria exposes your knowledge to AI agents via the Model Context Protocol.

# Register with Claude Code (stdio transport)
alxia mcp install claude-code

# Or run the HTTP server for other clients
alxia mcp serve-http --port 7219

Available MCP tools: guide, overview, list, grep, search, read, follow, history, why, events, timeline, git_log, git_show, git_blame, sources, subscriptions.
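Over the HTTP transport, tool calls follow the standard MCP JSON-RPC 2.0 shape. Below is a minimal sketch of the request an MCP client would send to invoke the `search` tool listed above; the argument key `"query"` is an assumption about Alexandria's tool schema, not a documented parameter:

```python
import json

# Standard MCP "tools/call" request (JSON-RPC 2.0).
# Tool name "search" is from the tool list above; the argument key
# "query" is an illustrative assumption about its input schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search",
        "arguments": {"query": "attention mechanisms"},
    },
}

payload = json.dumps(request)
print(payload)
```

An MCP client would POST this payload to the server started by `alxia mcp serve-http`.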

CLI Reference

alxia init                    Initialize ~/.alexandria/
alxia status                  Operational dashboard
alxia doctor                  Health checks

alxia ingest <file-or-url>    Ingest a source (file, PDF, URL)
alxia query <question>        Search across all knowledge
alxia why <topic>             Belief explainability + provenance
alxia lint                    Find wiki rot (stale citations)
alxia synthesize              Generate temporal digest

alxia source add <type>       Add a source adapter
alxia source list             List configured sources
alxia sync                    Pull from all sources
alxia subscriptions poll      Poll RSS + IMAP
alxia subscriptions list      Show pending items

alxia workspace use <slug>    Switch workspace
alxia project create <name>   Create a project workspace

alxia secrets set <ref>       Store an encrypted secret
alxia hooks install <client>  Install capture hooks
alxia capture conversation    Capture an agent session

alxia eval run                Run quality metrics (M1-M5)
alxia daemon start            Start background scheduler
alxia logs show               View structured logs

Architecture

  • SQLite + filesystem hybrid: filesystem is source of truth for documents, SQLite for search/metadata/events
  • FTS5 for keyword search (no vectors, no RAG — the agent IS the retriever)
  • Hostile verifier: every wiki write is verified before commit (citations, quote anchors, cascade policy)
  • Belief revision: structured claims with supersession chains and provenance
  • AES-256-GCM vault: encrypted secrets with PBKDF2 key derivation
  • Structured JSONL logging with run_id correlation
  • 8 schema migrations applied automatically on init
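The FTS5 search layer can be pictured with a few lines of Python's built-in sqlite3 module. This is illustrative only; Alexandria's actual schema lives in its migrations:

```python
import sqlite3

# Minimal FTS5 keyword index: the same SQLite feature Alexandria
# uses for search (its real table layout differs).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE pages USING fts5(title, body)")
conn.executemany(
    "INSERT INTO pages VALUES (?, ?)",
    [
        ("Transformers", "Attention mechanisms weigh token relevance."),
        ("CNNs", "Convolutions slide filters over local patches."),
    ],
)
# MATCH does tokenized, case-insensitive keyword search.
rows = conn.execute(
    "SELECT title FROM pages WHERE pages MATCH ?", ("attention",)
).fetchall()
print(rows)  # [('Transformers',)]
```

Because retrieval is plain FTS5, the connected agent decides what to search for and how to follow up, rather than a vector pipeline deciding for it.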

See docs/architecture/ for the 20 architecture documents.
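The vault's PBKDF2 key-derivation step can be sketched with the standard library. The iteration count and salt size here are illustrative assumptions, not Alexandria's actual parameters, and the AES-256-GCM encryption itself would use a third-party crypto library such as `cryptography`:

```python
import hashlib
import os

# Derive a 32-byte (256-bit) key from a passphrase with PBKDF2-HMAC-SHA256.
# Iteration count and salt size are illustrative, not Alexandria's settings.
salt = os.urandom(16)
key = hashlib.pbkdf2_hmac("sha256", b"master passphrase", salt, 600_000, dklen=32)
assert len(key) == 32  # the right size for an AES-256-GCM key

# Deriving again with the same salt and passphrase reproduces the key,
# which is what lets the vault be decrypted later.
again = hashlib.pbkdf2_hmac("sha256", b"master passphrase", salt, 600_000, dklen=32)
print(key == again)  # True
```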

Docker

docker build -t alexandria .
docker run -v ~/.alexandria:/data alexandria init
docker run -v ~/.alexandria:/data alexandria status

Development

git clone git@github.com:epappas/alexandria.git
cd alexandria
uv sync --dev
uv run pytest tests/       # 352 tests
./scripts/build.sh          # test + build
./scripts/publish.sh        # test + build + PyPI + git tag

License

MIT


Download files

Download the file for your platform.

Source Distribution

alexandria_wiki-0.19.1.tar.gz (567.2 kB)

Built Distribution

alexandria_wiki-0.19.1-py3-none-any.whl (215.4 kB)

File details

Details for the file alexandria_wiki-0.19.1.tar.gz.

File metadata

  • Download URL: alexandria_wiki-0.19.1.tar.gz
  • Size: 567.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

Hashes for alexandria_wiki-0.19.1.tar.gz
Algorithm     Hash digest
SHA256        ffd898e15f514c0fe7972c0f6f293568f175f1acb2c7aafcd89eca0af5976c65
MD5           607d873c6573c705582fa9402d39bc7a
BLAKE2b-256   126e467d43f34602df7dea1c95f18f2f23648f2099b3d059cc4b3f8c481fb0c5


File details

Details for the file alexandria_wiki-0.19.1-py3-none-any.whl.

File hashes

Hashes for alexandria_wiki-0.19.1-py3-none-any.whl
Algorithm     Hash digest
SHA256        3c01f5c5fc1586a1f49499081ebae98a6ee4f126c15306ce6c0df3728b3208e0
MD5           475f5c6689cc26f8f8856d28a2271f57
BLAKE2b-256   9da961c0410a766716819797d7da87405e14741e03a6ca14edd1794631636d6c

