# llmdocs

Agent-first documentation platform (CLI and server) — self-hosted, MCP-native, no external vector API required.
- MCP tools via Streamable HTTP (`/mcp`) for AI agents and IDEs
- Hybrid search — Chroma semantic + BM25 keyword fusion
- Raw markdown URLs — `GET /guide.md` returns clean content, no frontmatter
- `llms.txt` generation (coming soon)
- Embedded Chroma — no external vector DB needed
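Hybrid search means merging two ranked lists: one from semantic (embedding) similarity, one from BM25 keyword scoring. As an illustration only (the source does not say how llmdocs weights the two), reciprocal rank fusion is one common way such lists are combined:

```python
def reciprocal_rank_fusion(rankings, k=60):
    """Fuse several ranked result lists into one.

    `rankings` is a list of ranked lists of chunk IDs (best first).
    Each chunk scores 1 / (k + rank) per list; scores are summed.
    This is a common fusion scheme, not necessarily the one llmdocs uses.
    """
    scores = {}
    for ranking in rankings:
        for rank, chunk_id in enumerate(ranking, start=1):
            scores[chunk_id] = scores.get(chunk_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical result lists from the semantic and keyword sides.
semantic = ["guide.md#install", "api.md#auth", "faq.md#limits"]
keyword = ["api.md#auth", "faq.md#limits", "guide.md#install"]
fused = reciprocal_rank_fusion([semantic, keyword])
# "api.md#auth" wins: it ranks near the top of both lists.
```

A chunk that appears high in both rankings beats one that dominates only a single list, which is the point of fusing rather than concatenating results.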
## Quickstart (CLI)

```bash
python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt && pip install .
llmdocs --help
```
The `llmdocs` CLI is the only supported interface. Internal Python modules are implementation details and may change without notice.
## Development

```bash
python -m venv .venv && source .venv/bin/activate
pip install -r requirements-dev.txt
pip install -e .
pytest
```
Git note: this repo is rooted in `Projects/llmdocs`. If your editor opened a parent directory that is also a Git repo, run git commands from here so commits apply only to llmdocs.
## Server routes

| Route | Description |
|---|---|
| `GET /` | JSON metadata and endpoint index |
| `GET /health` | `{"status": "healthy"}` |
| `POST /mcp` | Streamable HTTP MCP endpoint (FastMCP) |
| `GET /<path>.md` | Raw markdown body, no frontmatter |
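Serving a "raw markdown body, no frontmatter" amounts to dropping a leading YAML block delimited by `---` lines before returning the file. A minimal sketch of that behavior (the project's real parser lives in `indexing/parser.py` and may handle more cases):

```python
def strip_frontmatter(text):
    """Remove a leading YAML frontmatter block delimited by '---' lines.

    Illustrative sketch of the behavior behind `GET /<path>.md`;
    not the project's actual implementation.
    """
    if text.startswith("---\n"):
        end = text.find("\n---\n", 4)  # closing delimiter after the opener
        if end != -1:
            return text[end + len("\n---\n"):]
    return text  # no frontmatter: serve the file unchanged

raw = "---\ntitle: Guide\ncategory: intro\n---\n# Guide\n\nBody text.\n"
clean = strip_frontmatter(raw)
```

Agents fetching these URLs therefore see only the rendered-ready markdown body, never the metadata block.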
## MCP tools

Connect any MCP-compatible client (Cursor, Claude, etc.) to `http://localhost:8080/mcp`.

| Tool | Description |
|---|---|
| `search_docs` | Hybrid semantic + keyword search over indexed chunks |
| `get_doc` | Fetch full document body and metadata by path |
| `list_docs` | List all documents, optionally filtered by category or path prefix |
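Under the Streamable HTTP transport, an MCP client invokes these tools by POSTing JSON-RPC 2.0 `tools/call` requests to `/mcp`. A sketch of the request body for `search_docs` — the `query` and `limit` argument names are assumptions based on the descriptions above, so check the schema the server advertises via `tools/list`:

```python
import json

# JSON-RPC 2.0 envelope used by MCP's tools/call method.
# "query" and "limit" are illustrative argument names, not confirmed
# by the llmdocs docs; the tool name "search_docs" is from the table above.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_docs",
        "arguments": {"query": "how do I configure indexing?", "limit": 5},
    },
}
body = json.dumps(request)  # what the client POSTs to http://localhost:8080/mcp
```

In practice an MCP client library builds this envelope for you; the sketch just shows what crosses the wire.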
## Package layout

```
llmdocs/
  cli.py          User entry point (Click commands)
  config.py       llmdocs.yaml loading
  models.py       Internal types (Document, Chunk, SearchResult)
  doc_paths.py    Safe URL → filesystem resolution
  server.py       FastAPI app + lifespan startup indexing
  indexing/       Indexing pipeline
    parser.py     Frontmatter + markdown loading
    chunker.py    H2/H3 section chunking
    hasher.py     SHA-256 file hashing for incremental indexing
    indexer.py    Chroma store + sentence-transformers embeddings
    search.py     HybridSearchEngine (semantic + BM25)
  mcp/            MCP layer (FastMCP, Streamable HTTP at /mcp)
    __init__.py   FastMCP server singleton + tool registrations
    runtime.py    LlmdocsRuntime — state shared between lifespan and tools
    tools.py      Tool logic (pure functions)
```
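"H2/H3 section chunking" means each `##` or `###` heading starts a new chunk, so search results map to addressable sections rather than whole files. A rough sketch of the idea (not the project's `chunker.py`, which may track titles, sizes, and overlap differently):

```python
import re

def chunk_by_sections(markdown):
    """Split markdown into chunks at H2/H3 headings.

    Illustrative sketch of heading-based chunking; llmdocs' own
    chunker may differ in details (overlap, size limits, metadata).
    """
    chunks, current = [], []
    for line in markdown.splitlines():
        # A new H2/H3 heading closes the chunk collected so far.
        if re.match(r"^#{2,3} ", line) and current:
            chunks.append("\n".join(current))
            current = []
        current.append(line)
    if current:
        chunks.append("\n".join(current))
    return chunks

doc = "# Title\nintro\n## Setup\nsteps\n### Details\nmore\n## Usage\nrun it"
chunks = chunk_by_sections(doc)  # 4 chunks: preamble, Setup, Details, Usage
```

Chunking at heading boundaries keeps each embedded unit topically coherent, which is what makes the hybrid search results useful to an agent.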
## Docker

The production image installs `requirements.txt`, then runs `pip install .`. A Dockerfile and docker-compose.yml will be added in a later task.
## License

MIT
## File details

Details for the file `llmdocs_mcp-0.1.0.tar.gz`.

### File metadata

- Download URL: llmdocs_mcp-0.1.0.tar.gz
- Upload date:
- Size: 26.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `034e7ffdb3673c21c9ff91373254be8bb3bea5d3118f8e5e09224d234f1383dc` |
| MD5 | `455762355804d416111c02b9405519c1` |
| BLAKE2b-256 | `2282035cea735c1912e0559a08706712e97a0da00305a0c0f218083bffd2dbb3` |
### Provenance

The following attestation bundles were made for `llmdocs_mcp-0.1.0.tar.gz`:

Publisher: `publish.yml` on `vinny380/llmdocs`

Statement:

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: llmdocs_mcp-0.1.0.tar.gz
- Subject digest: 034e7ffdb3673c21c9ff91373254be8bb3bea5d3118f8e5e09224d234f1383dc
- Sigstore transparency entry: 1183991414
- Sigstore integration time:
- Permalink: vinny380/llmdocs@adcf7423e42369287948fb918baad04a212ba024
- Branch / Tag: refs/tags/v0.1.0
- Owner: https://github.com/vinny380
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@adcf7423e42369287948fb918baad04a212ba024
- Trigger Event: push
## File details

Details for the file `llmdocs_mcp-0.1.0-py3-none-any.whl`.

### File metadata

- Download URL: llmdocs_mcp-0.1.0-py3-none-any.whl
- Upload date:
- Size: 22.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `f6f4d13374ef7055783c6820d672a03c97875a5c949c65ef29d51d652a947dae` |
| MD5 | `fda84ed60aa22e9b779e0d7c17c18925` |
| BLAKE2b-256 | `9bb4e94089cbad7f9851091cd784b5691a8c14b3d8c807ed5c44a0038720195d` |
### Provenance

The following attestation bundles were made for `llmdocs_mcp-0.1.0-py3-none-any.whl`:

Publisher: `publish.yml` on `vinny380/llmdocs`

Statement:

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: llmdocs_mcp-0.1.0-py3-none-any.whl
- Subject digest: f6f4d13374ef7055783c6820d672a03c97875a5c949c65ef29d51d652a947dae
- Sigstore transparency entry: 1183991643
- Sigstore integration time:
- Permalink: vinny380/llmdocs@adcf7423e42369287948fb918baad04a212ba024
- Branch / Tag: refs/tags/v0.1.0
- Owner: https://github.com/vinny380
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@adcf7423e42369287948fb918baad04a212ba024
- Trigger Event: push