
Automatically generate Swagger/OpenAPI documentation for Express.js APIs using NLP and LLMs

Project description

Auto Swagger Documentation Generator

A tool that automatically generates Swagger/OpenAPI documentation for Express.js APIs by combining NLP preprocessing with LLMs.

Overview

This project combines Natural Language Processing (NLP) techniques with Large Language Models (LLMs) to automatically generate high-quality API documentation. By preprocessing code with NLP before sending it to LLMs, we achieve:

  • Better context understanding
  • Reduced token usage
  • More specific and higher quality responses
  • Fine-grained control over the documentation pipeline
  • Automated codebase updates
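To make the token-reduction idea concrete, here is a minimal sketch (a hypothetical helper, not the project's actual pipeline) of NLP-style preprocessing that strips comments and collapses whitespace from Express route source before it reaches the LLM:

```python
import re

def preprocess_route_source(source: str) -> str:
    """Reduce token usage by stripping comments and collapsing whitespace.

    Illustrative only: a naive "//" strip would also mangle URLs inside
    string literals, so a real pipeline needs a proper tokenizer.
    """
    # Remove // line comments and /* ... */ block comments
    source = re.sub(r"//[^\n]*", "", source)
    source = re.sub(r"/\*.*?\*/", "", source, flags=re.DOTALL)
    # Collapse runs of whitespace into single spaces
    return re.sub(r"\s+", " ", source).strip()

route = """
// Fetch a single user
app.get('/users/:id', async (req, res) => {
    /* TODO: add caching */
    res.json(await db.users.find(req.params.id));
});
"""
print(preprocess_route_source(route))
```

The stripped source carries the same structural signal (method, path, handler shape) to the model in far fewer tokens.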

Features

  • Automatic API route detection
  • Intelligent parameter inference
  • Response schema generation
  • Validation rules detection
  • Swagger/OpenAPI compliant output
  • Support for Express.js routes
  • Automated documentation insertion
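As an illustration of what route detection involves, here is a simplified sketch that pattern-matches `app.<method>(path, ...)` calls (the project's actual parser is presumably more robust than a single regex):

```python
import re

# Matches app.get('/path', ...) / router.post("/path", ...) etc.
ROUTE_PATTERN = re.compile(
    r"""(?:app|router)\.(get|post|put|patch|delete)\s*\(\s*['"]([^'"]+)['"]"""
)

def detect_routes(source: str) -> list[tuple[str, str]]:
    """Return (HTTP method, path) pairs found in Express source."""
    return [(m.group(1).upper(), m.group(2)) for m in ROUTE_PATTERN.finditer(source)]

source = """
app.get('/users', listUsers);
app.post('/users', createUser);
router.delete('/users/:id', removeUser);
"""
print(detect_routes(source))
# → [('GET', '/users'), ('POST', '/users'), ('DELETE', '/users/:id')]
```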

Architecture

Architecture Diagram

Setup

  1. Clone the repository:
git clone https://github.com/yourusername/auto_swagger.git
cd auto_swagger
  2. Install uv if you haven't already:
curl -LsSf https://astral.sh/uv/install.sh | sh
  3. Create a virtual environment and install dependencies:
uv venv
uv sync
  4. Run the auto-swagger tool:
# Run the main documentation generator
uv run auto-swagger --repo-path path/to/express/app

# Run the fine-tuning tool (if needed)
uv run finetune

CLI Usage

Basic Usage

# Generate documentation for the current directory
uv run auto-swagger

# Generate documentation for a specific repository
uv run auto-swagger --repo-path /path/to/express/app

Command Line Arguments

  • --repo-path (optional): Path to the repository root. Defaults to current working directory.

    uv run auto-swagger --repo-path /Users/username/projects/my-api
    
  • --branch (optional): Branch to check for unmerged changes. Defaults to current branch.

    uv run auto-swagger --repo-path /path/to/repo --branch feature/new-endpoints
    
  • --model (optional): Hugging Face model name to use for generation. Overrides the default model in config.

    # Use Google Gemma model
    uv run auto-swagger --repo-path /path/to/repo --model "google/gemma-2-2b-it"
    
    # Use DeepSeek Coder model
    uv run auto-swagger --repo-path /path/to/repo --model "deepseek-ai/deepseek-coder-1.3b-instruct"
    
  • --lora-adapter (optional): LoRA adapter ID from Hugging Face. Use none to disable LoRA adapter and use base model only.

    # Use a custom LoRA adapter
    uv run auto-swagger --repo-path /path/to/repo --lora-adapter "username/my-adapter"
    
    # Disable LoRA adapter (use base model only)
    uv run auto-swagger --repo-path /path/to/repo --lora-adapter none
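The `--lora-adapter none` convention could be handled with `argparse` roughly as follows (a hypothetical sketch of the flag handling, not the tool's actual source):

```python
import argparse

parser = argparse.ArgumentParser(prog="auto-swagger")
parser.add_argument("--repo-path", default=".", help="Repository root")
parser.add_argument("--branch", default=None, help="Branch to check for unmerged changes")
parser.add_argument("--model", default=None, help="Hugging Face model name")
parser.add_argument("--lora-adapter", default=None,
                    help="LoRA adapter ID, or 'none' to disable")

args = parser.parse_args(["--repo-path", "/path/to/repo", "--lora-adapter", "none"])

# The literal string "none" disables the adapter entirely,
# distinguishing "use the default adapter" (flag omitted) from "use no adapter".
lora_adapter = None if args.lora_adapter == "none" else args.lora_adapter
print(lora_adapter)  # → None
```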
    

Example Commands

# Full example with all options
uv run auto-swagger \
  --repo-path "/Users/username/projects/my-api" \
  --branch "main" \
  --model "google/gemma-2-2b-it" \
  --lora-adapter none

# Use default model but disable LoRA adapter
uv run auto-swagger \
  --repo-path "/path/to/repo" \
  --lora-adapter none

# Use a different model with custom LoRA adapter
uv run auto-swagger \
  --repo-path "/path/to/repo" \
  --model "google/gemma-2-2b-it" \
  --lora-adapter "username/custom-adapter"

Project Structure

auto_swagger/
├── src/
│   └── auto_swagger/         # Source code
│       ├── config/          # Configuration management
│       ├── finetune/        # Model fine-tuning utilities
│       ├── parser/          # Code parsing and analysis
│       └── swagger_generator/ # Documentation generation
├── data/                    # Project data
│   ├── jsdocs_finetune.jsonl # Fine-tuning dataset
│   └── swagger_docs/        # Generated documentation
├── pyproject.toml          # Project configuration
└── README.md

Configuration

Default Model Configuration

The default model settings are defined in a config dataclass:

from dataclasses import dataclass
from typing import Optional

@dataclass
class LLMConfig:
    model_name: str = "deepseek-ai/deepseek-coder-1.3b-instruct"
    lora_adapter_id: Optional[str] = "paulopasso/auto-swagger"  # Default LoRA adapter
    max_new_tokens: int = 8192
    temperature: float = 0.2
    top_k: int = 50
    top_p: float = 0.95
    max_retries: int = 3
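Programmatically, CLI overrides map naturally onto the dataclass via `dataclasses.replace` (an illustrative sketch using a trimmed copy of the config above):

```python
from dataclasses import dataclass, replace
from typing import Optional

@dataclass
class LLMConfig:
    model_name: str = "deepseek-ai/deepseek-coder-1.3b-instruct"
    lora_adapter_id: Optional[str] = "paulopasso/auto-swagger"
    max_new_tokens: int = 8192
    temperature: float = 0.2

# Apply CLI overrides without mutating the defaults
config = replace(LLMConfig(), model_name="google/gemma-2-2b-it", lora_adapter_id=None)
print(config.model_name)       # → google/gemma-2-2b-it
print(config.lora_adapter_id)  # → None
```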

Overriding Configuration via CLI

You can override the model and LoRA adapter settings using command-line arguments (see CLI Usage above) without modifying the code:

# Use a different model
uv run auto-swagger --repo-path /path/to/repo --model "google/gemma-2-2b-it"

# Disable LoRA adapter
uv run auto-swagger --repo-path /path/to/repo --lora-adapter none

# Use custom model and adapter
uv run auto-swagger --repo-path /path/to/repo \
  --model "google/gemma-2-2b-it" \
  --lora-adapter "username/my-adapter"

Advanced Configuration

For advanced configuration (temperature, top_k, top_p, etc.), you can modify src/auto_swagger/swagger_generator/generator_config.py.

Future Improvements

  • Support for additional backend frameworks beyond Express.js
  • Local CLI version without GitHub app dependency
  • Enhanced pattern recognition
  • Additional documentation formats
  • Real-time documentation updates

Download files

Download the file for your platform.

Source Distribution

auto_swagger-0.1.1.tar.gz (387.0 kB)

Built Distribution


auto_swagger-0.1.1-py3-none-any.whl (35.6 kB)

File details

Details for the file auto_swagger-0.1.1.tar.gz.

File metadata

  • Download URL: auto_swagger-0.1.1.tar.gz
  • Size: 387.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: uv/0.9.18 {"installer":{"name":"uv","version":"0.9.18","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for auto_swagger-0.1.1.tar.gz:

  • SHA256: 85c4f22b47ff67d59e87452fbe0f3f5389ec5f007d7ef58baf4590843cb586be
  • MD5: f9ae23030842e3d461709f358a1d9d13
  • BLAKE2b-256: 29e1fa7b8ed69d3d86706d0ad2757e8fb6f6f9bcd2ae63f988f6f7d042b65eaf


File details

Details for the file auto_swagger-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: auto_swagger-0.1.1-py3-none-any.whl
  • Size: 35.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: uv/0.9.18 {"installer":{"name":"uv","version":"0.9.18","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for auto_swagger-0.1.1-py3-none-any.whl:

  • SHA256: c6aff0fa7ad11703dfdafaf1cdc490d6582300e99d24c96b811472751221eba7
  • MD5: 69356c200448b5fb415330f442204270
  • BLAKE2b-256: 703180cd279cb90c9e57fa2c645f0e168d0e9b126e18ba1fea4334400a392591

