
llmdocs

Agent-first documentation platform — self-hosted, MCP-native, no external vector API required.

  • MCP tools via Streamable HTTP (/mcp) for AI agents and IDEs
  • Hybrid search — Chroma semantic + BM25 keyword fusion
  • Raw markdown URLs — GET /guide.md returns clean content, no frontmatter
  • llms.txt generation (coming soon)
  • Embedded Chroma — no external vector DB needed
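
The hybrid search bullet above combines a semantic ranking (Chroma) with a keyword ranking (BM25). llmdocs does not document its exact fusion formula, so the sketch below uses Reciprocal Rank Fusion (RRF), a common way to merge two ranked lists; the function name and parameters are illustrative, not the project's API:

```python
def rrf_fuse(semantic: list[str], keyword: list[str], k: int = 60) -> list[str]:
    """Merge two ranked doc-id lists with Reciprocal Rank Fusion.

    Each list contributes 1 / (k + rank) per document, so items ranked
    highly in either list float to the top of the fused ranking.
    """
    scores: dict[str, float] = {}
    for ranking in (semantic, keyword):
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Highest fused score first
    return sorted(scores, key=scores.get, reverse=True)
```

A document that appears near the top of both rankings (here "a") outranks one that is top-ranked in only a single list.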

Quickstart (CLI)

python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt && pip install .
llmdocs --help

The llmdocs CLI is the only supported interface. Internal Python modules are implementation details and may change without notice.


Development

python -m venv .venv && source .venv/bin/activate
pip install -r requirements-dev.txt
pip install -e .
pytest

Git note: this repo is rooted in Projects/llmdocs. If your editor opened a parent directory that is also a Git repo, run git commands from here so commits apply only to llmdocs.


Server routes

Route           Description
GET /           JSON metadata and endpoint index
GET /health     {"status": "healthy"}
POST /mcp       Streamable HTTP MCP endpoint (FastMCP)
GET /<path>.md  Raw markdown body, no frontmatter
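
The routes in the table can be exercised with nothing but the standard library. This sketch assumes the server is on localhost:8080 (the address used in the MCP section below); adjust BASE for your deployment:

```python
import json
import urllib.request

BASE = "http://localhost:8080"  # assumed default; match your server's bind address

def raw_doc_url(path: str) -> str:
    # GET /<path>.md serves the raw markdown body, frontmatter stripped
    return f"{BASE}/{path.strip('/')}.md"

def is_healthy() -> bool:
    # GET /health returns {"status": "healthy"} per the route table
    with urllib.request.urlopen(f"{BASE}/health") as resp:
        return json.load(resp).get("status") == "healthy"
```

For example, raw_doc_url("guide") builds http://localhost:8080/guide.md, which returns the clean markdown content.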

MCP tools

Connect any MCP-compatible client (Cursor, Claude, etc.) to http://localhost:8080/mcp.
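
For clients configured through a JSON file (for example Cursor's mcp.json), a streamable-HTTP server entry typically looks like the following. The server name "llmdocs" is arbitrary and the exact schema varies by client, so treat this as a hedged example and check your client's documentation:

```json
{
  "mcpServers": {
    "llmdocs": {
      "url": "http://localhost:8080/mcp"
    }
  }
}
```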

Tool         Description
search_docs  Hybrid semantic + keyword search over indexed chunks
get_doc      Fetch full document body and metadata by path
list_docs    List all documents, optionally filtered by category or path prefix
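
Under the hood, each tool invocation is a JSON-RPC 2.0 tools/call message POSTed to /mcp. The sketch below builds such a payload by hand purely to illustrate the wire format; real clients and the MCP SDKs also handle initialization and session headers, and the tool argument names shown are illustrative:

```python
import json

def mcp_tool_call(name: str, arguments: dict, request_id: int = 1) -> str:
    """Serialize a JSON-RPC 2.0 'tools/call' request as used by MCP."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# Example: a hybrid search over the indexed chunks (argument name assumed)
payload = mcp_tool_call("search_docs", {"query": "section chunking"})
```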

Package layout

llmdocs/
  cli.py          User entry point (Click commands)
  config.py       llmdocs.yaml loading
  models.py       Internal types (Document, Chunk, SearchResult)
  doc_paths.py    Safe URL → filesystem resolution
  server.py       FastAPI app + lifespan startup indexing

  indexing/       Indexing pipeline
    parser.py     Frontmatter + markdown loading
    chunker.py    H2/H3 section chunking
    hasher.py     SHA-256 file hashing for incremental indexing
    indexer.py    Chroma store + sentence-transformers embeddings
    search.py     HybridSearchEngine (semantic + BM25)

  mcp/            MCP layer (FastMCP, Streamable HTTP at /mcp)
    __init__.py   FastMCP server singleton + tool registrations
    runtime.py    LlmdocsRuntime — state shared between lifespan and tools
    tools.py      Tool logic (pure functions)
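
To illustrate the chunking step above, here is a minimal sketch of splitting a markdown document at H2/H3 headings, in the spirit of indexing/chunker.py; the real implementation and its field names may differ:

```python
import re

def chunk_by_sections(markdown: str) -> list[dict]:
    """Split markdown into one chunk per H2/H3 section (illustrative only)."""
    chunks: list[dict] = []
    title, lines = "(preamble)", []
    for line in markdown.splitlines():
        m = re.match(r"^(#{2,3})\s+(.+)", line)  # matches '## ...' and '### ...'
        if m:
            if lines or chunks:  # flush the section accumulated so far
                chunks.append({"section": title, "text": "\n".join(lines).strip()})
            title, lines = m.group(2).strip(), []
        else:
            lines.append(line)
    chunks.append({"section": title, "text": "\n".join(lines).strip()})
    return chunks
```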

Docker

The production image installs dependencies from requirements.txt and then runs pip install . to install the package. A Dockerfile and docker-compose.yml will be added in a later task.


License

MIT


Download files


Source Distribution

llmdocs_mcp-0.1.0.tar.gz (26.6 kB)

Built Distribution

llmdocs_mcp-0.1.0-py3-none-any.whl (22.9 kB)

File details

Details for the file llmdocs_mcp-0.1.0.tar.gz.

File metadata

  • Download URL: llmdocs_mcp-0.1.0.tar.gz
  • Upload date:
  • Size: 26.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for llmdocs_mcp-0.1.0.tar.gz
Algorithm    Hash digest
SHA256       034e7ffdb3673c21c9ff91373254be8bb3bea5d3118f8e5e09224d234f1383dc
MD5          455762355804d416111c02b9405519c1
BLAKE2b-256  2282035cea735c1912e0559a08706712e97a0da00305a0c0f218083bffd2dbb3


Provenance

The following attestation bundles were made for llmdocs_mcp-0.1.0.tar.gz:

Publisher: publish.yml on vinny380/llmdocs


File details

Details for the file llmdocs_mcp-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: llmdocs_mcp-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 22.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for llmdocs_mcp-0.1.0-py3-none-any.whl
Algorithm    Hash digest
SHA256       f6f4d13374ef7055783c6820d672a03c97875a5c949c65ef29d51d652a947dae
MD5          fda84ed60aa22e9b779e0d7c17c18925
BLAKE2b-256  9bb4e94089cbad7f9851091cd784b5691a8c14b3d8c807ed5c44a0038720195d


Provenance

The following attestation bundles were made for llmdocs_mcp-0.1.0-py3-none-any.whl:

Publisher: publish.yml on vinny380/llmdocs

