# Apimatic 🚀

*Universal API documentation generator*
A tool that automatically generates beautiful, comprehensive API documentation (Markdown or OpenAPI) from your source code. It supports Flask, FastAPI, and other frameworks, with optional AI-powered enhancements via Ollama.
## 📦 Installation

```bash
pip install Apimatic
```

Upgrade to the latest version:

```bash
pip install --upgrade Apimatic
```
## ⚡ Usage

```text
Apimatic [-h] [--src SRC] [--framework [FRAMEWORK ...]] [--format {markdown,openapi}]
         [--output OUTPUT] [--use-ollama] [--model MODEL]
```
## 🔑 Options

| Option | Description |
|---|---|
| `-h, --help` | Show help message and exit |
| `--src SRC` | Root directory of the project to scan (default: current directory) |
| `--framework [FRAMEWORK ...]` | Force a specific framework (`flask`, `fastapi`, etc.). If omitted, auto-detected |
| `--format {markdown,openapi}` | Output format, `markdown` or `openapi` (default: `markdown`) |
| `--output OUTPUT` | Path for the generated output file (default: `API_Docs.md` or `openapi.yaml`) |
| `--use-ollama` | Enhance generated docs with descriptions from a local Ollama model |
| `--model MODEL` | Ollama model for enhancement (e.g., `llama3:instruct`). Requires `--use-ollama` |
## 📝 Examples

Generate Markdown docs from the current project:

```bash
Apimatic --src . --format markdown --output API_Docs.md
```

Generate an OpenAPI spec:

```bash
Apimatic --src . --format openapi --output openapi.yaml
```

Force a specific framework (Flask):

```bash
Apimatic --src ./my_flask_app --framework flask
```

Enhance documentation with AI (Ollama model):

```bash
Apimatic --src . --use-ollama --model llama3.2:1b
```
## 🤖 Recommended Ollama Models (1–2 GB)

When using `--use-ollama`, you can choose a local model for API explanations. Here are lightweight models that run well (1–2 GB range):
| Model | Size | Why Use It |
|---|---|---|
| `llama3.2:1b` | ~1.3 GB | Fast, nimble, and great for generating clear API explanations (recommended) |
| `gemma2:2b` | ~1.6 GB | Slightly larger, richer outputs, good balance of quality and size |
| `dolphin-phi` | ~1.6 GB | Alternative small model with solid reasoning ability |
| `orca-mini` | ~1.9 GB | Bigger (3B parameters) but still under 2 GB; more context-aware |
| `moondream2` | ~0.8 GB | Ultra-light, very fast, but less detailed |
👉 **Recommended default:** `llama3.2:1b` – best speed + clarity tradeoff.

Example:

```bash
Apimatic --src . --use-ollama --model llama3.2:1b
```
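Under the hood, `--use-ollama` talks to a locally running Ollama server. As a sketch of what such an enhancement call looks like, the snippet below builds a request for Ollama's standard `/api/generate` endpoint; the prompt wording and helper names are assumptions for illustration, not Apimatic's internals, and the HTTP call requires Ollama to be running on its default port.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_enhance_request(model: str, endpoint_summary: str) -> dict:
    """Build a JSON payload asking a local model to describe an endpoint.

    The prompt text is a hypothetical example, not Apimatic's actual prompt.
    """
    return {
        "model": model,
        "prompt": ("Write a one-paragraph description for this API endpoint:\n"
                   + endpoint_summary),
        "stream": False,  # ask for a single JSON response, not a stream
    }

def enhance(model: str, endpoint_summary: str) -> str:
    """Send the payload to a running Ollama server and return its text."""
    payload = json.dumps(build_enhance_request(model, endpoint_summary)).encode()
    req = urllib.request.Request(OLLAMA_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:  # needs `ollama serve` running
        return json.loads(resp.read())["response"]
```

With `llama3.2:1b` pulled locally, `enhance("llama3.2:1b", "GET /users - list users")` would return the model's generated description.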
## 🤝 Contributing

Contributions are welcome! Please fork the repo, make your changes, and submit a PR.

## 📄 License

This project is licensed under the MIT License.
## File details

Details for the file `apimatic-0.1.5.tar.gz`.

### File metadata

- Download URL: apimatic-0.1.5.tar.gz
- Upload date:
- Size: 11.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.3

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `23f7436314fdaeafe188345d5e432a704c5bae7c8fe44a73c91aeb50e082c223` |
| MD5 | `521afbd38c67ce3d0ba1992e903f80a6` |
| BLAKE2b-256 | `160489eac1e2059851e4e65171faeaffd8761e8e8796caba033752d5eef98d3a` |
## File details

Details for the file `apimatic-0.1.5-py3-none-any.whl`.

### File metadata

- Download URL: apimatic-0.1.5-py3-none-any.whl
- Upload date:
- Size: 16.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.3

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `cf9303be6aab285803a8cf37cea78d471c31188ba18287d8172f1091e3a9231f` |
| MD5 | `c8d9d1049a5d50f8ee8ec1716ca3ed8c` |
| BLAKE2b-256 | `f99d7403fff4da8538631829667f1c541738e2ee0a7ed9f05cdbf4cc11be026c` |