Auto Swagger Documentation Generator
A sophisticated tool that automatically generates Swagger/OpenAPI documentation for Express.js APIs using advanced NLP techniques and LLMs.
Overview
This project combines Natural Language Processing (NLP) techniques with Large Language Models (LLMs) to automatically generate high-quality API documentation. By preprocessing code with NLP before sending it to LLMs, we achieve:
- Better context understanding
- Reduced token usage
- More specific and higher quality responses
- Fine-grained control over the documentation pipeline
- Automated code base updates
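The preprocess-then-prompt idea can be illustrated with a minimal sketch. The function below (hypothetical, not the tool's actual code) strips an Express source file down to its route definitions before it would be sent to an LLM, which is one simple way to reduce token usage while keeping the relevant context:

```python
import re

def preprocess_source(source: str) -> str:
    """Keep only route-relevant lines from an Express source file,
    shrinking the prompt sent to the LLM (simplified heuristic)."""
    keep = re.compile(r"\b(app|router)\.(get|post|put|patch|delete|use)\b")
    lines = [ln for ln in source.splitlines() if keep.search(ln)]
    return "\n".join(lines)

src = """\
const express = require('express');
const app = express();
// fetch a single user
app.get('/users/:id', (req, res) => res.json(user));
app.post('/users', createUser);
app.listen(3000);
"""
print(preprocess_source(src))
```

A real pipeline would use proper parsing rather than a regex, but the effect is the same: only the two route lines survive, so the LLM sees a short, focused prompt.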
Features
- Automatic API route detection
- Intelligent parameter inference
- Response schema generation
- Validation rules detection
- Swagger/OpenAPI compliant output
- Support for Express.js routes
- Automated documentation insertion
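To make parameter inference concrete, here is a simplified sketch (not the tool's actual logic) of how an Express-style path such as /users/:id can be mapped to an OpenAPI path with inferred path-parameter objects:

```python
import re

def path_params(route: str) -> tuple[str, list[dict]]:
    """Convert an Express-style path to an OpenAPI path plus the
    inferred path-parameter objects (simplified sketch)."""
    params = []

    def repl(m: re.Match) -> str:
        name = m.group(1)
        params.append({
            "name": name,
            "in": "path",
            "required": True,
            "schema": {"type": "string"},
        })
        return "{" + name + "}"

    openapi_path = re.sub(r":(\w+)", repl, route)
    return openapi_path, params

path, params = path_params("/users/:userId/posts/:postId")
print(path)  # /users/{userId}/posts/{postId}
```

In practice the tool also has to infer types and query/body parameters, which is where the NLP preprocessing and LLM come in; this sketch covers only the purely mechanical part.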
Architecture
Setup
- Clone the repository:
git clone https://github.com/yourusername/auto_swagger.git
cd auto_swagger
- Install uv if you haven't already:
curl -LsSf https://astral.sh/uv/install.sh | sh
- Create a virtual environment and install dependencies:
uv venv
uv sync
- Run the auto-swagger tool:
# Run the main documentation generator
uv run auto-swagger --repo-path path/to/express/app
# Run the fine-tuning tool (if needed)
uv run finetune
CLI Usage
Basic Usage
# Generate documentation for the current directory
uv run auto-swagger
# Generate documentation for a specific repository
uv run auto-swagger --repo-path /path/to/express/app
Command Line Arguments
- --repo-path (optional): Path to the repository root. Defaults to the current working directory.
  uv run auto-swagger --repo-path /Users/username/projects/my-api
- --branch (optional): Branch to check for unmerged changes. Defaults to the current branch.
  uv run auto-swagger --repo-path /path/to/repo --branch feature/new-endpoints
- --model (optional): Hugging Face model name to use for generation. Overrides the default model in the config.
  # Use Google Gemma model
  uv run auto-swagger --repo-path /path/to/repo --model "google/gemma-2-2b-it"
  # Use DeepSeek Coder model
  uv run auto-swagger --repo-path /path/to/repo --model "deepseek-ai/deepseek-coder-1.3b-instruct"
- --lora-adapter (optional): LoRA adapter ID from Hugging Face. Use none to disable the LoRA adapter and use the base model only.
  # Use a custom LoRA adapter
  uv run auto-swagger --repo-path /path/to/repo --lora-adapter "username/my-adapter"
  # Disable LoRA adapter (use base model only)
  uv run auto-swagger --repo-path /path/to/repo --lora-adapter none
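The four flags above could be wired up with argparse roughly as follows. This is a sketch of the interface, not the project's actual source:

```python
import argparse
import os

def build_parser() -> argparse.ArgumentParser:
    """Build a CLI parser mirroring the documented auto-swagger flags."""
    parser = argparse.ArgumentParser(prog="auto-swagger")
    parser.add_argument("--repo-path", default=os.getcwd(),
                        help="Path to the repository root")
    parser.add_argument("--branch", default=None,
                        help="Branch to check for unmerged changes")
    parser.add_argument("--model", default=None,
                        help="Hugging Face model name for generation")
    parser.add_argument("--lora-adapter", default=None,
                        help="LoRA adapter ID, or 'none' to disable")
    return parser

args = build_parser().parse_args(
    ["--repo-path", "/path/to/repo", "--lora-adapter", "none"]
)
print(args.repo_path, args.lora_adapter)
```

Note that argparse turns --repo-path into args.repo_path, and that "none" arrives as the literal string, so the tool has to interpret it specially when disabling the adapter.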
Example Commands
# Full example with all options
uv run auto-swagger \
--repo-path "/Users/username/projects/my-api" \
--branch "main" \
--model "google/gemma-2-2b-it" \
--lora-adapter none
# Use default model but disable LoRA adapter
uv run auto-swagger \
--repo-path "/path/to/repo" \
--lora-adapter none
# Use a different model with custom LoRA adapter
uv run auto-swagger \
--repo-path "/path/to/repo" \
--model "google/gemma-2-2b-it" \
--lora-adapter "username/custom-adapter"
Project Structure
auto_swagger/
├── src/
│   └── auto_swagger/            # Source code
│       ├── config/              # Configuration management
│       ├── finetune/            # Model fine-tuning utilities
│       ├── parser/              # Code parsing and analysis
│       └── swagger_generator/   # Documentation generation
├── data/                        # Project data
│   ├── jsdocs_finetune.jsonl    # Fine-tuning dataset
│   └── swagger_docs/            # Generated documentation
├── pyproject.toml               # Project configuration
└── README.md
Configuration
Default Model Configuration
The project uses a config with default model settings:
from dataclasses import dataclass
from typing import Optional

@dataclass
class LLMConfig:
    model_name: str = "deepseek-ai/deepseek-coder-1.3b-instruct"
    lora_adapter_id: Optional[str] = "paulopasso/auto-swagger"  # Default LoRA adapter
    max_new_tokens: int = 8192
    temperature: float = 0.2
    top_k: int = 50
    top_p: float = 0.95
    max_retries: int = 3
Overriding Configuration via CLI
You can override the model and LoRA adapter settings using command-line arguments (see CLI Usage above) without modifying the code:
# Use a different model
uv run auto-swagger --repo-path /path/to/repo --model "google/gemma-2-2b-it"
# Disable LoRA adapter
uv run auto-swagger --repo-path /path/to/repo --lora-adapter none
# Use custom model and adapter
uv run auto-swagger --repo-path /path/to/repo \
--model "google/gemma-2-2b-it" \
--lora-adapter "username/my-adapter"
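Under the hood, such overrides could be merged into the dataclass with dataclasses.replace. The following is a hypothetical sketch (with a trimmed copy of the config so it stands alone), not the project's actual code:

```python
from dataclasses import dataclass, replace
from typing import Optional

@dataclass
class LLMConfig:
    # Trimmed copy of the default config, redeclared for a self-contained sketch
    model_name: str = "deepseek-ai/deepseek-coder-1.3b-instruct"
    lora_adapter_id: Optional[str] = "paulopasso/auto-swagger"

def apply_overrides(cfg: LLMConfig,
                    model: Optional[str] = None,
                    lora_adapter: Optional[str] = None) -> LLMConfig:
    """Apply CLI overrides; '--lora-adapter none' disables the adapter."""
    if model:
        cfg = replace(cfg, model_name=model)
    if lora_adapter is not None:
        cfg = replace(
            cfg,
            lora_adapter_id=None if lora_adapter == "none" else lora_adapter,
        )
    return cfg

cfg = apply_overrides(LLMConfig(),
                      model="google/gemma-2-2b-it",
                      lora_adapter="none")
```

Using replace keeps the default config immutable and makes each override explicit, which is one reasonable design for layering CLI arguments over a dataclass config.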
Advanced Configuration
For advanced configuration (temperature, top_k, top_p, etc.), you can modify src/auto_swagger/swagger_generator/generator_config.py.
Future Improvements
- Support for additional backend frameworks beyond Express.js
- Local CLI version without GitHub app dependency
- Enhanced pattern recognition
- Additional documentation formats
- Real-time documentation updates
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file auto_swagger-0.1.0.tar.gz.
File metadata
- Download URL: auto_swagger-0.1.0.tar.gz
- Upload date:
- Size: 386.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.9.15 {"installer":{"name":"uv","version":"0.9.15","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"macOS","version":null,"id":null,"libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null}
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 4fccffcc1984b110e325340548a3590764684877f80b52b4d47a03fee1e561e1 |
| MD5 | e3a2b85004a7c7c32ffb293eaf856490 |
| BLAKE2b-256 | aeaea682bddf52f6a56994c95099f4f862b31f6aeef4a95125cbe235734a21c7 |
File details
Details for the file auto_swagger-0.1.0-py3-none-any.whl.
File metadata
- Download URL: auto_swagger-0.1.0-py3-none-any.whl
- Upload date:
- Size: 35.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.9.15 {"installer":{"name":"uv","version":"0.9.15","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"macOS","version":null,"id":null,"libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null}
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | b9f8993a9a075932c93414f7f4f19f8fd76754ed48d5fbb06187703e8b310205 |
| MD5 | 4a627adec4bfc36a30a024a6f3d7655b |
| BLAKE2b-256 | 1528d35ce7d45b1332f91bfb5deaa71a49c6f48bb41f717957598e25a305f356 |
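To check a downloaded artifact against the digests above, you can hash it locally with Python's standard hashlib:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the hex SHA256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the SHA256 listed for the wheel above:
expected = "b9f8993a9a075932c93414f7f4f19f8fd76754ed48d5fbb06187703e8b310205"
# assert sha256_of("auto_swagger-0.1.0-py3-none-any.whl") == expected
```

Reading in fixed-size chunks keeps memory use constant regardless of file size; the final assert is commented out because it requires the downloaded wheel to be present.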