
llmdocs

Agent-first documentation platform — self-hosted, MCP-native, no external vector API required.

  • MCP tools via Streamable HTTP (/mcp) for AI agents and IDEs
  • Hybrid search — Chroma semantic + BM25 keyword fusion
  • Raw markdown URLs — GET /guide.md returns clean content, no frontmatter
  • llms.txt generation (coming soon)
  • Embedded Chroma — no external vector DB needed
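The hybrid search bullet above fuses two rankings: Chroma's semantic results and BM25's keyword results. The actual scoring inside HybridSearchEngine is not documented here, so as an illustrative assumption this sketch uses reciprocal rank fusion (RRF), a common fusion strategy; the function name and the constant k are not llmdocs internals.

```python
# Illustrative sketch: reciprocal rank fusion (RRF) of two ranked lists.
# The real HybridSearchEngine may use a different fusion formula.

def rrf_fuse(semantic_ids, keyword_ids, k=60):
    """Fuse two ranked lists of document IDs into one ranking.

    Each document scores 1 / (k + rank) per list it appears in;
    k dampens the influence of top-ranked outliers.
    """
    scores = {}
    for ranked in (semantic_ids, keyword_ids):
        for rank, doc_id in enumerate(ranked, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# A document ranked in both lists ("b") outranks documents
# that appear in only one list:
fused = rrf_fuse(["a", "b", "c"], ["b", "d"])
```

The appeal of rank-based fusion is that semantic distances and BM25 scores live on incompatible scales, while ranks are directly comparable.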

Quickstart (CLI)

python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt && pip install .
llmdocs --help

The llmdocs CLI is the only supported interface. Internal Python modules are implementation details and may change without notice.


Development

python -m venv .venv && source .venv/bin/activate
pip install -r requirements-dev.txt
pip install -e .
pytest

Git note: this repo is rooted in Projects/llmdocs. If your editor opened a parent directory that is also a Git repo, run git commands from here so commits apply only to llmdocs.


Server routes

Route            Description
GET /            JSON metadata and endpoint index
GET /health      {"status": "healthy"}
POST /mcp        Streamable HTTP MCP endpoint (FastMCP)
GET /<path>.md   Raw markdown body, no frontmatter
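The GET /<path>.md route maps a URL path to a file on disk, so it must reject paths that escape the docs root (the job doc_paths.py describes as "safe URL → filesystem resolution"). A minimal sketch of that kind of check; the function name and docs-root layout here are assumptions, not llmdocs internals.

```python
# Illustrative sketch of safe URL -> filesystem resolution.
# Names and error handling are assumptions, not llmdocs internals.
from pathlib import Path

def resolve_doc_path(docs_root: str, url_path: str) -> Path:
    """Map a request path like 'guide.md' to a file under docs_root.

    Raises ValueError if the resolved path escapes the root
    (e.g. via '..' segments).
    """
    root = Path(docs_root).resolve()
    candidate = (root / url_path.lstrip("/")).resolve()
    if root != candidate and root not in candidate.parents:
        raise ValueError(f"path escapes docs root: {url_path}")
    return candidate
```

Resolving both paths before comparing is what defeats traversal tricks like `../etc/passwd`, which survive naive string-prefix checks.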

MCP tools

Connect any MCP-compatible client (Cursor, Claude, etc.) to http://localhost:8080/mcp.

Tool          Description
search_docs   Hybrid semantic + keyword search over indexed chunks
get_doc       Fetch full document body and metadata by path
list_docs     List all documents, optionally filtered by category or path prefix
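Under the hood, an MCP client invokes these tools by POSTing JSON-RPC 2.0 "tools/call" requests to /mcp. A sketch of that wire shape for search_docs; the argument name "query" is an assumption inferred from the tool description, and in practice a client library (Cursor, Claude, FastMCP's client, etc.) builds this for you along with the transport headers and session handling.

```python
# Illustrative JSON-RPC 2.0 payload an MCP client POSTs to /mcp.
# The "query" argument name is an assumption, not a documented schema.
import json

def tools_call_payload(tool: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 'tools/call' request body for an MCP endpoint."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

body = tools_call_payload("search_docs", {"query": "hybrid search"})
```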

Package layout

llmdocs/
  cli.py          User entry point (Click commands)
  config.py       llmdocs.yaml loading
  models.py       Internal types (Document, Chunk, SearchResult)
  doc_paths.py    Safe URL → filesystem resolution
  server.py       FastAPI app + lifespan startup indexing

  indexing/       Indexing pipeline
    parser.py     Frontmatter + markdown loading
    chunker.py    H2/H3 section chunking
    hasher.py     SHA-256 file hashing for incremental indexing
    indexer.py    Chroma store + sentence-transformers embeddings
    search.py     HybridSearchEngine (semantic + BM25)

  mcp/            MCP layer (FastMCP, Streamable HTTP at /mcp)
    __init__.py   FastMCP server singleton + tool registrations
    runtime.py    LlmdocsRuntime — state shared between lifespan and tools
    tools.py      Tool logic (pure functions)
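The layout above says chunker.py splits documents at H2/H3 sections. A minimal sketch of heading-based chunking, assuming each chunk carries its heading text; the real chunker may track hierarchy, chunk sizes, or overlap differently, and its output shape is not documented here.

```python
# Illustrative sketch of H2/H3 section chunking.
# The real chunker.py may differ in output shape and edge-case handling.
import re

# Matches lines starting with exactly ## or ### followed by a title.
HEADING = re.compile(r"^(#{2,3})\s+(.*)$", re.MULTILINE)

def chunk_by_headings(markdown: str) -> list[dict]:
    """Split markdown into chunks at H2/H3 headings.

    Text before the first H2/H3 becomes a chunk with heading None.
    """
    chunks = []
    last_end = 0
    heading = None
    for match in HEADING.finditer(markdown):
        body = markdown[last_end:match.start()].strip()
        if body or heading is not None:
            chunks.append({"heading": heading, "text": body})
        heading = match.group(2).strip()
        last_end = match.end()
    tail = markdown[last_end:].strip()
    if tail or heading is not None:
        chunks.append({"heading": heading, "text": tail})
    return chunks
```

Chunking at section boundaries keeps each embedded unit semantically coherent, which tends to matter more for retrieval quality than uniform chunk sizes.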

Docker

The production image installs dependencies from requirements.txt and then runs pip install .. A Dockerfile and docker-compose.yml will be added in a later task.
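Until the official Dockerfile lands, a minimal sketch consistent with the description above (install requirements.txt, then pip install .); the base image, exposed port, and entry command are assumptions.

```dockerfile
# Illustrative sketch only; the official Dockerfile is planned for a later task.
FROM python:3.13-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
RUN pip install --no-cache-dir .
EXPOSE 8080
# Entry command is an assumption; check `llmdocs --help` for the real one.
CMD ["llmdocs"]
```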


License

MIT

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llmdocs_mcp-0.1.1.tar.gz (26.6 kB)


Built Distribution


llmdocs_mcp-0.1.1-py3-none-any.whl (22.9 kB)


File details

Details for the file llmdocs_mcp-0.1.1.tar.gz.

File metadata

  • Download URL: llmdocs_mcp-0.1.1.tar.gz
  • Upload date:
  • Size: 26.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for llmdocs_mcp-0.1.1.tar.gz
Algorithm Hash digest
SHA256 d0fea27f7a825dfc082989639b2c24e9397944cef331913fb163d325cb24cd33
MD5 6b63302db8f0fca01cb7a4067c297708
BLAKE2b-256 b15c148e1c55bf0f02d1400c876ccc7398b8d4fdf2d9298851ae198ae55a3707


Provenance

The following attestation bundles were made for llmdocs_mcp-0.1.1.tar.gz:

Publisher: publish.yml on vinny380/llmdocs

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file llmdocs_mcp-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: llmdocs_mcp-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 22.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for llmdocs_mcp-0.1.1-py3-none-any.whl
Algorithm Hash digest
SHA256 692c59c2ece7ed2f8ff4c62ca1f8412d718a9f73058f06cb355882aab10a3190
MD5 dd52b733edc9f04cb86c53d0c024c3ea
BLAKE2b-256 0e390080147d6d47df5203496207f4c8eecca4247b09308d13d860e392b406da


Provenance

The following attestation bundles were made for llmdocs_mcp-0.1.1-py3-none-any.whl:

Publisher: publish.yml on vinny380/llmdocs

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
