
pytest-swag

Generate OpenAPI documentation from pytest tests.

pytest-swag is a framework-agnostic pytest plugin that turns your existing API tests into living OpenAPI 3.0/3.1 documentation. Define your API spec inline with a fluent builder DSL, validate responses against it with jsonschema, and produce a complete OpenAPI document at the end of your test session.




Installation

pip install pytest-swag

Optional extras:

pip install pytest-swag[yaml]       # YAML output support
pip install pytest-swag[requests]   # requests library adapter
pip install pytest-swag[dev]        # Development dependencies

Quick Start

def test_get_blog(swag):
    swag.path("/blogs/{id}").get("Retrieves a blog")
    swag.parameter("id", in_="path", schema={"type": "string"})
    swag.response(200, schema={
        "type": "object",
        "properties": {"id": {"type": "integer"}, "title": {"type": "string"}},
    })

    response = client.get("/blogs/1")  # `client` is your app's HTTP test client fixture
    swag.validate(response.status_code, response.json())

Run your tests with the --swag flag:

pytest --swag

This generates an openapi.json file containing your full API specification.

How It Works

  1. Define your API spec using the swag fixture's builder DSL
  2. Validate each response against the declared schema (jsonschema)
  3. Collect all validated operations across your test suite
  4. Generate a complete OpenAPI document at session end

Only tests that pass validation are included in the output. Failed tests are automatically excluded, keeping your documentation accurate.
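The core of step 2 can be illustrated without the plugin: the status code must be documented, and the body's JSON types must match the declared schema. The following is a minimal, dependency-free sketch of that idea (the real plugin delegates this to the jsonschema library; the helper names here are hypothetical):

```python
# Map JSON Schema type names to the Python types of a decoded JSON body.
JSON_TYPES = {
    "object": dict, "array": list, "string": str,
    "integer": int, "number": (int, float), "boolean": bool,
}

def validate_response(declared, status_code, body):
    """Fail if the status code is undocumented or the body's shape
    disagrees with the schema declared for that status code."""
    if status_code not in declared:
        raise AssertionError(f"undocumented status code: {status_code}")
    _check(declared[status_code], body)

def _check(schema, value):
    expected = JSON_TYPES[schema["type"]]
    if not isinstance(value, expected):
        raise AssertionError(
            f"expected {schema['type']}, got {type(value).__name__}")
    if schema["type"] == "object":
        for name, sub in schema.get("properties", {}).items():
            if name in value:
                _check(sub, value[name])

declared = {200: {"type": "object",
                  "properties": {"id": {"type": "integer"}}}}
validate_response(declared, 200, {"id": 1})  # passes silently
```

An undocumented status code, such as `validate_response(declared, 404, {})`, raises `AssertionError`, which is what excludes the test's operation from the generated document.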

Configuration

Via pyproject.toml

[tool.pytest-swag]
openapi = "3.1.0"
output_path = "docs/openapi.json"
output_format = "json"   # "json", "yaml", or "both"

[tool.pytest-swag.info]
title = "My API"
version = "1.0.0"

Via conftest.py fixture

import pytest

@pytest.fixture(scope="session")
def swag_config():
    return {
        "openapi": "3.1.0",
        "info": {"title": "My API", "version": "1.0.0"},
        "output_path": "docs/openapi.json",
        "output_format": "json",
        "servers": [{"url": "https://api.example.com/v1"}],
        "security": [{"BearerAuth": []}],
    }

Builder DSL Reference

Path & HTTP Methods

swag.path("/users").get("List users")
swag.path("/users").post("Create user")
swag.path("/users/{id}").put("Update user")
swag.path("/users/{id}").patch("Partial update")
swag.path("/users/{id}").delete("Delete user")

Parameters

# Path parameter (always required)
swag.parameter("id", in_="path", schema={"type": "integer"})

# Query parameter (optional by default)
swag.parameter("page", in_="query", schema={"type": "integer"})

# Required header
swag.parameter("X-Api-Key", in_="header", schema={"type": "string"}, required=True)

Request Body

swag.request_body(
    content_type="application/json",
    schema={
        "type": "object",
        "required": ["title"],
        "properties": {
            "title": {"type": "string"},
            "content": {"type": "string"},
        },
    },
)

Responses

# With schema
swag.response(200, description="OK", schema={
    "type": "object",
    "properties": {"id": {"type": "integer"}},
})

# Without schema (e.g. 204 No Content)
swag.response(204, description="Deleted")

# With $ref (requires swag_schemas fixture)
swag.response(200, schema={"$ref": "#/components/schemas/User"})

Tags & Security

swag.tag("Users")
swag.security("BearerAuth")

Validation

# Manual validation
swag.validate(response.status_code, response.json())

# Validates:
# 1. Status code is documented
# 2. Response body matches the declared schema (via jsonschema)

Capture (Schema-Free)

Record actual API responses for documentation without defining schemas upfront. Schemas are automatically inferred from the response body.

def test_get_blog(swag):
    swag.path("/blogs/{id}").get("Get blog")
    swag.parameter("id", in_="path", schema={"type": "string"})

    response = client.get("/blogs/1")
    assert response.status_code == 200       # validate with pytest
    assert "title" in response.json()

    swag.capture(200, response.json())       # capture for docs

# Disable schema inference (example only)
swag.capture(200, response.json(), infer_schema=False)
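Schema inference of this kind can be sketched as a recursive mapping from decoded JSON values to JSON Schema types. This is a simplified illustration, not the plugin's actual algorithm:

```python
def infer_schema(value):
    """Infer a minimal JSON Schema from a decoded JSON value."""
    if isinstance(value, bool):  # bool is a subclass of int: check it first
        return {"type": "boolean"}
    if isinstance(value, int):
        return {"type": "integer"}
    if isinstance(value, float):
        return {"type": "number"}
    if isinstance(value, str):
        return {"type": "string"}
    if value is None:
        return {"type": "null"}
    if isinstance(value, list):
        # Take the first element as representative of the array's items.
        return {"type": "array",
                "items": infer_schema(value[0]) if value else {}}
    if isinstance(value, dict):
        return {"type": "object",
                "properties": {k: infer_schema(v) for k, v in value.items()}}
    raise TypeError(f"not a JSON value: {value!r}")

infer_schema({"id": 1, "title": "Hello"})
# → {"type": "object", "properties": {"id": {"type": "integer"},
#                                     "title": {"type": "string"}}}
```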

The swag_requests fixture auto-captures on validate_response().

Note: capture() and validate() cannot be used in the same test.

Component Schemas ($ref Support)

Define reusable schemas via the swag_schemas fixture:

@pytest.fixture(scope="session")
def swag_schemas():
    return {
        "User": {
            "type": "object",
            "required": ["id", "name"],
            "properties": {
                "id": {"type": "integer"},
                "name": {"type": "string"},
                "email": {"type": "string", "format": "email"},
            },
        },
        "Error": {
            "type": "object",
            "properties": {
                "message": {"type": "string"},
            },
        },
    }

Then reference them in your tests:

def test_get_user(swag):
    swag.path("/users/{id}").get("Get user")
    swag.parameter("id", in_="path", schema={"type": "integer"})
    swag.response(200, schema={"$ref": "#/components/schemas/User"})
    swag.response(404, schema={"$ref": "#/components/schemas/Error"})

    response = client.get("/users/1")
    swag.validate(response.status_code, response.json())

The $ref references are recursively resolved during validation and preserved as-is in the generated OpenAPI document.
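The resolution step can be sketched as a recursive walk that expands local component references in place (a simplified illustration with hypothetical names; a production resolver must also guard against circular references):

```python
def resolve_refs(schema, components):
    """Recursively replace local '#/components/schemas/X' refs with the
    named component schema, so a validator sees an expanded schema."""
    if isinstance(schema, dict):
        ref = schema.get("$ref", "")
        if ref.startswith("#/components/schemas/"):
            name = ref.rsplit("/", 1)[1]
            return resolve_refs(components[name], components)
        return {k: resolve_refs(v, components) for k, v in schema.items()}
    if isinstance(schema, list):
        return [resolve_refs(item, components) for item in schema]
    return schema

components = {"User": {"type": "object",
                       "properties": {"id": {"type": "integer"}}}}
resolve_refs({"$ref": "#/components/schemas/User"}, components)
# → {"type": "object", "properties": {"id": {"type": "integer"}}}
```

Note that only the validator needs the expanded form; the generated OpenAPI document keeps the `$ref` strings verbatim.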

Security Schemes

@pytest.fixture(scope="session")
def swag_security_schemes():
    return {
        "BearerAuth": {
            "type": "http",
            "scheme": "bearer",
            "bearerFormat": "JWT",
        },
        "ApiKeyAuth": {
            "type": "apiKey",
            "in": "header",
            "name": "X-API-Key",
        },
    }

Requests Adapter

For projects using the requests library, use the swag_requests fixture for automatic response extraction:

import requests

def test_list_users(swag_requests):
    swag_requests.path("/users").get("List users")
    swag_requests.response(200, schema={
        "type": "array",
        "items": {"$ref": "#/components/schemas/User"},
    })

    response = requests.get("http://localhost:8000/users")
    swag_requests.validate_response(response)
    # Automatically extracts status_code and JSON body from the response object

For schema-free capture without validation:

def test_list_users(swag_requests):
    swag_requests.path("/users").get("List users")

    response = requests.get("http://localhost:8000/users")
    assert response.status_code == 200          # validate with pytest

    swag_requests.capture_response(response)    # capture for docs (schema auto-inferred)

run_test() — rswag-style Request + Capture

Send HTTP requests directly from the builder and auto-capture (or validate) the response in one call. Similar to rswag's run_test!.

# Provide a test client via fixture
@pytest.fixture
def swag_client():
    return app.test_client()

# Schema-free: capture only
def test_get_blog(swag_requests):
    swag_requests.path("/blogs/{id}").get("Get blog")
    swag_requests.parameter("id", in_="path", schema={"type": "string"}, value="1")

    response = swag_requests.run_test()       # sends GET /blogs/1, captures response
    assert response.json()["title"] == "Hello"

# With schema: validate + capture
def test_get_blog_validated(swag_requests):
    swag_requests.path("/blogs/{id}").get("Get blog")
    swag_requests.parameter("id", in_="path", schema={"type": "string"}, value="1")
    swag_requests.response(200, schema={
        "type": "object",
        "properties": {"id": {"type": "integer"}, "title": {"type": "string"}},
    })

    response = swag_requests.run_test()       # sends request, validates, captures
    assert response.json()["title"] == "Hello"

Without swag_client, run_test() uses the requests library with servers[0].url from swag_config:

@pytest.fixture(scope="session")
def swag_config():
    return {"servers": [{"url": "http://localhost:8000"}]}
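How such a helper could derive the request URL can be sketched as joining the first configured server URL with the path template and substituting the declared parameter values. This is a hypothetical illustration, not the plugin's internals:

```python
def build_url(config, path_template, path_params):
    """Join the first configured server URL with a path template,
    filling each {name} placeholder from the supplied values."""
    base = config["servers"][0]["url"].rstrip("/")
    path = path_template
    for name, value in path_params.items():
        path = path.replace("{%s}" % name, str(value))
    return base + path

config = {"servers": [{"url": "http://localhost:8000"}]}
build_url(config, "/blogs/{id}", {"id": "1"})
# → "http://localhost:8000/blogs/1"
```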

Multi-Document Output

Generate multiple OpenAPI documents from a single test suite using swag.doc():

@pytest.fixture(scope="session")
def swag_config():
    return [
        {"info": {"title": "Public API", "version": "1.0.0"}, "output_path": "docs/public.json"},
        {"info": {"title": "Admin API", "version": "1.0.0"}, "output_path": "docs/admin.json"},
    ]

def test_public_endpoint(swag):
    swag.doc("Public API")
    swag.path("/posts").get("List posts")
    swag.response(200, schema={"type": "array"})
    swag.validate(200, [])

def test_admin_endpoint(swag):
    swag.doc("Admin API")
    swag.path("/admin/users").get("List all users")
    swag.response(200, schema={"type": "array"})
    swag.validate(200, [])
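The routing behind swag.doc() can be sketched as bucketing each collected operation into the document whose info.title it names (hypothetical helper and record shapes, for illustration only):

```python
def route_operations(configs, operations):
    """Group (doc_title, path, method, operation) records into one
    skeleton document per configured title."""
    docs = {c["info"]["title"]: {"openapi": "3.1.0",
                                 "info": c["info"],
                                 "paths": {}}
            for c in configs}
    for title, path, method, op in operations:
        docs[title]["paths"].setdefault(path, {})[method] = op
    return docs

configs = [{"info": {"title": "Public API", "version": "1.0.0"}},
           {"info": {"title": "Admin API", "version": "1.0.0"}}]
ops = [("Public API", "/posts", "get", {"summary": "List posts"}),
       ("Admin API", "/admin/users", "get", {"summary": "List all users"})]
docs = route_operations(configs, ops)
# docs["Public API"]["paths"] → {"/posts": {"get": {"summary": "List posts"}}}
```

Each resulting document would then be written to its own output_path at session end.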

CLI Options

Option               Description
--swag               Enable OpenAPI document generation
--swag-output PATH   Override the output file path
--swag-dry-run       Print the OpenAPI document to stdout instead of writing a file
--swag-no-output     Run validation only, skip file generation
--swag-strict        Warn when a test uses the swag fixture but never calls validate()

Full Example

# conftest.py
import pytest

@pytest.fixture(scope="session")
def swag_config():
    return {
        "openapi": "3.1.0",
        "info": {"title": "Blog API", "version": "1.0.0"},
        "servers": [{"url": "https://api.example.com/v1"}],
        "security": [{"BearerAuth": []}],
        "output_path": "docs/openapi.json",
        "output_format": "both",
    }

@pytest.fixture(scope="session")
def swag_schemas():
    return {
        "Blog": {
            "type": "object",
            "required": ["id", "title"],
            "properties": {
                "id": {"type": "integer"},
                "title": {"type": "string"},
                "content": {"type": "string"},
            },
        },
    }

@pytest.fixture(scope="session")
def swag_security_schemes():
    return {
        "BearerAuth": {"type": "http", "scheme": "bearer", "bearerFormat": "JWT"},
    }

# test_blogs.py
def test_list_blogs(swag):
    swag.path("/blogs").get("List all blogs")
    swag.tag("Blogs")
    swag.parameter("page", in_="query", schema={"type": "integer"})
    swag.response(200, schema={
        "type": "array",
        "items": {"$ref": "#/components/schemas/Blog"},
    })

    response = client.get("/blogs")
    swag.validate(response.status_code, response.json())

def test_create_blog(swag):
    swag.path("/blogs").post("Create a blog")
    swag.tag("Blogs")
    swag.security("BearerAuth")
    swag.request_body(schema={
        "type": "object",
        "required": ["title"],
        "properties": {"title": {"type": "string"}, "content": {"type": "string"}},
    })
    swag.response(201, schema={"$ref": "#/components/schemas/Blog"})

    response = client.post("/blogs", json={"title": "Hello", "content": "World"})
    swag.validate(response.status_code, response.json())

def test_delete_blog(swag):
    swag.path("/blogs/{id}").delete("Delete a blog")
    swag.tag("Blogs")
    swag.parameter("id", in_="path", schema={"type": "integer"})
    swag.response(204, description="Deleted")

    response = client.delete("/blogs/1")
    swag.validate(response.status_code, None)

pytest --swag
# Generates docs/openapi.json and docs/openapi.yaml

Requirements

  • Python >= 3.10
  • pytest >= 7.0
  • jsonschema >= 4.0
  • PyYAML >= 6.0 (optional, for YAML output)

Acknowledgments

pytest-swag is inspired by rswag, the excellent Ruby/RSpec library for generating Swagger/OpenAPI documentation from integration tests. We are grateful to the rswag team for pioneering the "test-driven documentation" approach that bridges the gap between API testing and API documentation. pytest-swag brings this philosophy to the Python/pytest ecosystem.

License

MIT
