fast-llms-txt

Generate an llms.txt markdown manifest from your FastAPI OpenAPI schema for AI agents. The result is roughly a 75% size reduction compared with the raw OpenAPI spec JSON.

Inspired by the llms.txt specification for LLM-friendly documentation.

Why?

OpenAPI and FastAPI's support for it are excellent, but the specification is designed for deterministic machine interpretation. It must be complete and precise. Every schema, every $ref, every possible response. This results in a very large document.

AI agents have different needs:

  • Context windows are limited. A 50KB OpenAPI spec consumes tokens that could be used for reasoning.
  • Agents can infer. A Task mentioned in one endpoint is probably the same Task concept elsewhere. They don't need every relationship spelled out.
  • Agents recover from errors. If an API responds that foo needs to be an integer, the agent adapts. It doesn't need perfect type information upfront.

This project applies the llms.txt philosophy (concise, readable documentation for LLMs) to APIs.

Installation

uv add fast-llms-txt

Or, with pip:

pip install fast-llms-txt

Usage

from fastapi import FastAPI
from fast_llms_txt import create_llms_txt_router

app = FastAPI(title="My API", description="A sample API")

@app.get("/users")
def list_users(limit: int = 10):
    """List all users."""
    return []

# Mount the llms.txt endpoint
app.include_router(create_llms_txt_router(app), prefix="/docs")

The prefix determines the final URL path. For example:

  • prefix="/docs" serves GET /docs/llms.txt
  • prefix="/api/v1/docs" serves GET /api/v1/docs/llms.txt

Now GET /docs/llms.txt returns:

# My API

> A sample API

## Endpoints

### `GET /users` - List all users.

- **Request Parameters**:
  - `limit` (integer, optional)
- **Returns** (200): Successful Response

API

create_llms_txt_router(app, path="/llms.txt")

Creates a FastAPI router that serves the llms.txt endpoint.

  • app: Your FastAPI application instance
  • path: The endpoint path (default: /llms.txt)

generate_llms_txt(openapi_schema)

Directly converts an OpenAPI schema dict to an llms.txt markdown string.
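To illustrate the shape of the transformation, here is a simplified sketch of turning an OpenAPI dict into llms.txt-style markdown. This is not the library's implementation (the function name `sketch_llms_txt` is hypothetical, and real output includes parameters and responses); for actual use, call `generate_llms_txt(app.openapi())`.

```python
# Hypothetical sketch of the schema-to-markdown idea; NOT the library's code.
def sketch_llms_txt(schema: dict) -> str:
    info = schema.get("info", {})
    lines = [f"# {info.get('title', 'API')}"]
    if info.get("description"):
        lines += ["", f"> {info['description']}"]
    lines += ["", "## Endpoints"]
    # Walk each path and HTTP method, emitting one heading per operation.
    for path, operations in schema.get("paths", {}).items():
        for method, op in operations.items():
            summary = op.get("summary", "")
            lines += ["", f"### `{method.upper()} {path}` - {summary}"]
    return "\n".join(lines)

schema = {
    "info": {"title": "My API", "description": "A sample API"},
    "paths": {"/users": {"get": {"summary": "List all users."}}},
}
print(sketch_llms_txt(schema))
```

Run against the Usage example's schema, this prints the same headings shown in the sample output above.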


Appendix: Release Procedure

Versioning

This project uses semantic versioning:

  • PATCH (0.1.x): Bug fixes, no API changes
  • MINOR (0.x.0): New features, backward compatible
  • MAJOR (x.0.0): Breaking API changes

Prerequisites

  • GitHub CLI installed and authenticated (gh auth login)

Release Steps

Run the release script:

./scripts/release.sh 0.2.0

This will:

  1. Update version in pyproject.toml and fast_llms_txt/__init__.py
  2. Show diff and prompt for confirmation
  3. Commit the version bump
  4. Prompt for release notes (or auto-generate from commits)
  5. Push and create GitHub release (triggers PyPI publish via GitHub Actions)

Infrastructure

  • PyPI: pypi.org/project/fast-llms-txt
  • Trusted Publishing: No tokens required; GitHub Actions authenticates via OIDC
  • Environment: release environment in GitHub repo settings restricts publishing to v* tags
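A Trusted Publishing workflow along these lines would satisfy the constraints above. This is a hedged sketch, not the repository's actual publish.yml; the job and step names are assumptions, but the `id-token: write` permission and `release` environment are what OIDC-based publishing requires.

```yaml
# Sketch of an OIDC-based PyPI publish workflow (assumed file contents).
name: Publish
on:
  push:
    tags: ["v*"]          # matches the tag restriction on the release environment
jobs:
  publish:
    runs-on: ubuntu-latest
    environment: release  # gated environment from repo settings
    permissions:
      id-token: write     # required for Trusted Publishing via OIDC; no tokens
    steps:
      - uses: actions/checkout@v4
      - run: python -m pip install build && python -m build
      - uses: pypa/gh-action-pypi-publish@release/v1
```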

Download files

Download the file for your platform.

Source Distribution

fast_llms_txt-0.5.2.tar.gz (59.9 kB)

Built Distribution

fast_llms_txt-0.5.2-py3-none-any.whl (7.2 kB)

File details

Details for the file fast_llms_txt-0.5.2.tar.gz.

File metadata

  • Download URL: fast_llms_txt-0.5.2.tar.gz
  • Upload date:
  • Size: 59.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for fast_llms_txt-0.5.2.tar.gz:

  • SHA256: 434becd2b27a37c8b1c6bcd220462feb0840ef1415fcb99c8cb73b318f94e7a3
  • MD5: 553d708c42d213e663051c2458d52eba
  • BLAKE2b-256: 022515bcd5fbdbc8b83c673a16760ee37ca15cc3e01975df78ff6ace05a06c75

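A downloaded artifact can be checked against the SHA256 listed above. A generic standard-library sketch (the helper name `sha256_of` is ours, not part of this project):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large archives don't need to fit in memory.
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result for fast_llms_txt-0.5.2.tar.gz against the
# SHA256 value listed above.
```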

Provenance

The following attestation bundles were made for fast_llms_txt-0.5.2.tar.gz:

Publisher: publish.yml on AlteredCraft/fast-llms-txt

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file fast_llms_txt-0.5.2-py3-none-any.whl.

File metadata

  • Download URL: fast_llms_txt-0.5.2-py3-none-any.whl
  • Upload date:
  • Size: 7.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for fast_llms_txt-0.5.2-py3-none-any.whl:

  • SHA256: bca18eb7c6073f3f6ecc48fd1091b7098060e3b14cb34b9c07ac290acfcc2579
  • MD5: 806579fb91f120fb863582cbcdfc23dc
  • BLAKE2b-256: 24027f541ab3b1c9a94e190ebf2be74d0ffcbc15b555f9dac4f4e7d60f552a16


Provenance

The following attestation bundles were made for fast_llms_txt-0.5.2-py3-none-any.whl:

Publisher: publish.yml on AlteredCraft/fast-llms-txt

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
