# FastScaff

FastAPI project scaffolding tool - quickly generate standardized FastAPI project structures.
## Installation

```bash
pip install fastscaff
```
## Commands

### Create Project

```bash
fastscaff new myproject --orm sqlalchemy
```
Options:

- `--orm` - ORM choice: `sqlalchemy` or `tortoise` (default: `tortoise`)
- `--output` - Output directory (default: current directory)
- `--with-rbac` - Include Casbin RBAC support
- `--with-celery` - Include Celery task queue support
- `--force` - Overwrite existing directory
Examples:

```bash
# Basic project with SQLAlchemy
fastscaff new myproject --orm sqlalchemy

# Full-featured project
fastscaff new myproject --orm sqlalchemy --with-celery --with-rbac

# Specify output directory
fastscaff new myproject --output /path/to/dir
```
### Generate Models from Database

Generate ORM models by introspecting existing MySQL database tables:

```bash
cd myproject
fastscaff models --db-url "mysql://user:pass@localhost:3306/mydb"
```
Options:

- `--db-url` - Database connection URL (required)
- `--orm` - Target ORM: `sqlalchemy` or `tortoise` (auto-detected from requirements.txt)
- `--tables` - Comma-separated table names (default: all tables)
- `--output` - Output directory (default: current directory)
Examples:

```bash
# In project directory - ORM is auto-detected
cd myproject
fastscaff models --db-url "mysql://root:password@localhost:3306/mydb"

# Generate models for specific tables
fastscaff models --db-url "mysql://..." --tables user,order,product

# Explicitly specify ORM
fastscaff models --db-url "mysql://..." --orm tortoise
```
Generated models include:
- Field type mapping
- Primary keys and auto-increment
- Indexes
- Foreign key relationships
- Table and column comments
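The field type mapping step can be pictured as a lookup table from raw MySQL column types to ORM type names. The mapping and helper below are an illustrative sketch of the idea, not fastscaff's actual internals:

```python
# Simplified MySQL -> SQLAlchemy type mapping sketch (illustrative;
# fastscaff's real mapping table is not shown here).
MYSQL_TO_SQLALCHEMY = {
    "int": "Integer",
    "bigint": "BigInteger",
    "varchar": "String",
    "text": "Text",
    "datetime": "DateTime",
    "decimal": "Numeric",
    "tinyint(1)": "Boolean",  # the MySQL boolean convention
    "json": "JSON",
}

def map_column_type(mysql_type: str) -> str:
    """Map a raw column type such as 'varchar(255)' to a SQLAlchemy
    type name, falling back to String for unknown types."""
    base = mysql_type.lower().strip()
    if base in MYSQL_TO_SQLALCHEMY:
        return MYSQL_TO_SQLALCHEMY[base]
    # Strip a length/arguments suffix such as '(255)'
    base = base.split("(", 1)[0]
    return MYSQL_TO_SQLALCHEMY.get(base, "String")
```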
## Project Structure

```
myproject/
├── app/
│   ├── main.py              # Application entry point
│   ├── core/
│   │   ├── config.py        # Settings (env-based)
│   │   ├── database.py      # Database connection
│   │   ├── redis.py         # Redis client
│   │   ├── security.py      # Password hashing, JWT
│   │   ├── logger.py        # Structured logging
│   │   └── lifespan.py      # Startup/shutdown events
│   ├── api/v1/
│   │   ├── router.py        # API router
│   │   └── endpoints/       # Route handlers
│   ├── models/              # ORM models
│   ├── schemas/             # Pydantic schemas
│   ├── repositories/        # Data access layer
│   ├── services/            # Business logic layer
│   ├── middleware/          # Request/response middleware
│   ├── exceptions/          # Custom exceptions
│   └── utils/               # Utility functions
├── tests/
├── .env.example
├── Dockerfile
├── docker-compose.yml
├── Makefile
└── requirements.txt
```
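The `lifespan.py` module in the tree above handles startup/shutdown events. A minimal sketch of the pattern, assuming hypothetical `init_db`/`close_db` helpers (FastAPI accepts such an async context manager as its `lifespan=` argument):

```python
# Illustrative lifespan sketch; resource names are hypothetical,
# not fastscaff's generated code.
from contextlib import asynccontextmanager

events: list[str] = []

async def init_db() -> None:      # placeholder for a real pool init
    events.append("db-up")

async def close_db() -> None:     # placeholder for a real pool close
    events.append("db-down")

@asynccontextmanager
async def lifespan(app):
    await init_db()        # runs once on startup
    try:
        yield              # the application serves requests here
    finally:
        await close_db()   # runs once on shutdown
```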
## Architecture
The generated project follows a layered architecture:
| Layer | Directory | Responsibility |
|---|---|---|
| API | `api/` | HTTP handling, request validation, response formatting |
| Service | `services/` | Business logic, orchestration |
| Repository | `repositories/` | Data access, database queries |
| Model | `models/` | Database table definitions |
| Schema | `schemas/` | Request/response data structures |
Services are accessed via a singleton registry pattern:

```python
from app.services import registry

user = await registry.user_service.get_user_by_id(user_id)
```
## Built-in Features

### Middleware
- CORS handling
- Request logging with trace ID
- JWT authentication
- Security headers
- Request signing verification
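The trace-ID idea from the list above can be sketched as a pure-ASGI middleware that stamps every response. The header name and structure are assumptions for illustration, not the generated project's exact middleware:

```python
# Pure-ASGI middleware sketch: attach a trace ID header to every
# HTTP response (header name is an assumption).
import uuid

class TraceIDMiddleware:
    def __init__(self, app):
        self.app = app

    async def __call__(self, scope, receive, send):
        if scope["type"] != "http":
            # Pass non-HTTP traffic (e.g. websockets) straight through.
            await self.app(scope, receive, send)
            return
        trace_id = uuid.uuid4().hex

        async def send_with_trace(message):
            if message["type"] == "http.response.start":
                headers = list(message.get("headers", []))
                headers.append((b"x-trace-id", trace_id.encode()))
                message["headers"] = headers
            await send(message)

        await self.app(scope, receive, send_with_trace)
```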
### Utilities
- Snowflake ID generator
- Rate limiter (Redis-based)
- Cache decorator
- Password hashing
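A Snowflake ID generator like the one listed above packs a millisecond timestamp, a worker ID, and a per-millisecond sequence into a 64-bit integer. The sketch below uses the common 41/10/12-bit layout; the exact epoch and layout in fastscaff are assumptions:

```python
# Snowflake-style 64-bit ID generator sketch:
# 41 bits timestamp (ms) | 10 bits worker id | 12 bits sequence.
import threading
import time

class Snowflake:
    EPOCH = 1_600_000_000_000  # custom epoch in ms (assumed)

    def __init__(self, worker_id: int = 0):
        assert 0 <= worker_id < 1024
        self.worker_id = worker_id
        self.sequence = 0
        self.last_ms = -1
        self.lock = threading.Lock()

    def next_id(self) -> int:
        with self.lock:
            now = int(time.time() * 1000)
            if now == self.last_ms:
                self.sequence = (self.sequence + 1) & 0xFFF
                if self.sequence == 0:
                    # Sequence exhausted this millisecond: wait for the next.
                    while now <= self.last_ms:
                        now = int(time.time() * 1000)
            else:
                self.sequence = 0
            self.last_ms = now
            return ((now - self.EPOCH) << 22) | (self.worker_id << 12) | self.sequence
```

IDs are unique per worker and sort by generation time, which is why they work well as primary keys in sharded setups.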
### Database

- SQLite by default (zero configuration)
- MySQL/PostgreSQL ready (just update `DATABASE_URL`)
- Async database operations
- Request-scoped sessions (SQLAlchemy)
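Request-scoped sessions are commonly built on `contextvars`: each concurrent request sees its own session because asyncio tasks copy the context at creation. The sketch below illustrates the mechanism with a plain dict standing in for a real async DB session; it is not fastscaff's generated code:

```python
# Request-scoped "session" via ContextVar (illustrative sketch).
import asyncio
import contextvars

_session = contextvars.ContextVar("session", default=None)

async def handle_request(name: str) -> str:
    _session.set({"owner": name})   # middleware would open the session
    await asyncio.sleep(0)          # yield so other requests interleave
    return _session.get()["owner"]  # still this request's own session
```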
## Running the Project

```bash
cd myproject
pip install -r requirements.txt
make dev
```
The project runs immediately with SQLite - no database setup required.
Available make commands:

```bash
make dev          # Start development server
make test         # Run tests
make lint         # Run linter
make format       # Format code
make docker-up    # Start all services (Docker)
make docker-down  # Stop all services
```
If Celery is enabled:

```bash
make celery-worker  # Start Celery worker
make celery-beat    # Start Celery beat scheduler
```
## Configuration

Configuration is managed via environment variables. Copy `.env.example` to `.env`:

```
# Application
ENV=dev
DEBUG=true
PORT=8000

# Database
DATABASE_URL=sqlite+aiosqlite:///./app.db
# DATABASE_URL=mysql+aiomysql://user:pass@localhost:3306/mydb

# Redis
REDIS_URL=redis://localhost:6379/0

# JWT
JWT_SECRET_KEY=your-secret-key
JWT_ACCESS_TOKEN_EXPIRE_MINUTES=30

# Celery (if enabled)
CELERY_BROKER_URL=redis://localhost:6379/1
CELERY_RESULT_BACKEND=redis://localhost:6379/1
```
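Loading these variables into a typed settings object could look like the dependency-free sketch below (the generated `app/core/config.py` likely uses a settings library instead; field names here mirror the `.env` keys above):

```python
# Env-based settings sketch (illustrative, not fastscaff's config.py).
import os
from dataclasses import dataclass, field

@dataclass
class Settings:
    env: str = field(default_factory=lambda: os.getenv("ENV", "dev"))
    debug: bool = field(
        default_factory=lambda: os.getenv("DEBUG", "false").lower() == "true"
    )
    port: int = field(default_factory=lambda: int(os.getenv("PORT", "8000")))
    database_url: str = field(
        default_factory=lambda: os.getenv(
            "DATABASE_URL", "sqlite+aiosqlite:///./app.db"
        )
    )
```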
## Development

```bash
# Clone the repository
git clone https://github.com/lee-hangzhou/fastscaff.git
cd fastscaff

# Install in development mode
pip install -e ".[dev]"

# Run tests
pytest

# Code formatting
ruff check --fix .
ruff format .
```
## License

MIT