
fast-llms-txt


Generate an llms.txt markdown manifest from your FastAPI OpenAPI schema for AI agents. The result is roughly 40-50% smaller than the raw OpenAPI spec JSON.

Inspired by the llms.txt specification for LLM-friendly documentation.

Why?

OpenAPI, and FastAPI's support for it, is excellent, but the specification is designed for deterministic machine interpretation. It must be complete and precise: every schema, every $ref, every possible response. The result is a very large document.

AI agents have different needs:

  • Context windows are limited. A 50KB OpenAPI spec consumes tokens that could be used for reasoning.
  • Agents can infer. A Task mentioned in one endpoint is probably the same Task concept elsewhere. They don't need every relationship spelled out.
  • Agents recover from errors. If an API responds that foo needs to be an integer, the agent adapts. It doesn't need perfect type information upfront.

This project applies the llms.txt philosophy (concise, readable documentation for LLMs) to APIs.

Installation

uv add fast-llms-txt

Usage

from fastapi import FastAPI
from fast_llms_txt import create_llms_txt_router

app = FastAPI(title="My API", description="A sample API")

@app.get("/users")
def list_users(limit: int = 10):
    """List all users."""
    return []

# Mount the llms.txt endpoint
app.include_router(create_llms_txt_router(app), prefix="/docs")

The prefix determines the final URL path. For example:

  • prefix="/docs" → GET /docs/llms.txt
  • prefix="/api/v1/docs" → GET /api/v1/docs/llms.txt

Now GET /docs/llms.txt returns:

# My API

> A sample API

## Endpoints

### `GET /users` - List all users.

- **Request Parameters**:
  - `limit` (integer, optional)
- **Returns** (200): Successful Response

API

create_llms_txt_router(app, path="/llms.txt")

Creates a FastAPI router that serves the llms.txt endpoint.

  • app: Your FastAPI application instance
  • path: The endpoint path (default: /llms.txt)

generate_llms_txt(openapi_schema)

Directly converts an OpenAPI schema dict to an llms.txt markdown string.


Appendix: Release Procedure

Versioning

This project uses semantic versioning:

  • PATCH (0.1.x): Bug fixes, no API changes
  • MINOR (0.x.0): New features, backward compatible
  • MAJOR (x.0.0): Breaking API changes

Prerequisites

  • GitHub CLI installed and authenticated (gh auth login)

Release Steps

Run the release script:

./scripts/release.sh 0.2.0

This will:

  1. Update version in pyproject.toml and fast_llms_txt/__init__.py
  2. Show diff and prompt for confirmation
  3. Commit the version bump
  4. Prompt for release notes (or auto-generate from commits)
  5. Push and create GitHub release (triggers PyPI publish via GitHub Actions)

Infrastructure

  • PyPI: pypi.org/project/fast-llms-txt
  • Trusted Publishing: No tokens required; GitHub Actions authenticates via OIDC
  • Environment: release environment in GitHub repo settings restricts publishing to v* tags
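A trusted-publishing workflow along these lines would match the setup described. This is a hedged sketch, not the project's actual publish.yml: the job layout, the v* tag trigger, and the use of pypa/gh-action-pypi-publish are assumptions.

```yaml
name: publish
on:
  push:
    tags: ["v*"]

jobs:
  publish:
    runs-on: ubuntu-latest
    environment: release          # repo settings restrict this environment to v* tags
    permissions:
      id-token: write             # required for OIDC-based Trusted Publishing
    steps:
      - uses: actions/checkout@v4
      - run: pipx run build       # build sdist and wheel into dist/
      - uses: pypa/gh-action-pypi-publish@release/v1   # no API token; authenticates via OIDC
```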
