
fastai-stack

Build production-ready, AI-first FastAPI backends in minutes.

fastai-stack is an interactive CLI generator for modern Python teams that want strongly typed APIs, an async-first architecture, and built-in paths for LLM apps, embeddings, background jobs, observability, and container deployment.

Why fastai-stack

  • AI-native backend scaffolding, not generic boilerplate
  • FastAPI + Pydantic v2 conventions out of the box
  • Opinionated project structure for scale and maintainability
  • Interactive and non-interactive modes for both humans and CI
  • One command to scaffold full backend architecture

Core Features

  • Typer-based CLI with clean UX and smart defaults
  • Jinja2 template engine with conditional generation
  • Modular API layout (app/api/v1/endpoints/*)
  • Pydantic settings model for environment-driven config
  • Async-ready backend structure
  • Optional AI endpoints for chat and embeddings
  • Optional vector DB integration (pgvector, weaviate)
  • Optional task queue support (celery + redis)
  • Optional monitoring support (sentry, prometheus)
  • Docker + Docker Compose with CPU/GPU profiles
  • Poetry-based dependency management
  • Test + CI scaffolding included
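The conditional generation mentioned above can be pictured as a small sketch: a Jinja2 template fragment that emits an AI router block only when the matching feature flag is set. The template text and flag name here are hypothetical, not the package's actual templates.

```python
from jinja2 import Environment

# Hypothetical fragment of a scaffold template: the AI router block is
# emitted only when the ai flag is something other than "none".
MAIN_TEMPLATE = """\
from fastapi import FastAPI

app = FastAPI(title="{{ project_name }}")
{% if ai != "none" %}
from app.api.v1.endpoints.ai import router as ai_router
app.include_router(ai_router, prefix="/api/v1/ai")
{% endif %}
"""

def render_main(project_name: str, ai: str = "none") -> str:
    # trim_blocks/lstrip_blocks keep the generated file free of stray blank lines.
    env = Environment(trim_blocks=True, lstrip_blocks=True)
    return env.from_string(MAIN_TEMPLATE).render(project_name=project_name, ai=ai)
```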

Install

pip install fastai-stack

Quick Start

Interactive Mode

fastai-stack create myapp --interactive

Non-Interactive Mode

fastai-stack create myapp \
	--non-interactive \
	--db postgres \
	--auth jwt \
	--tasks celery+redis \
	--ai openai \
	--vector-db pgvector \
	--docker gpu \
	--monitoring prometheus \
	--frontend htmx+vite

CLI Options

fastai-stack create <project_name> [options]

  • --interactive / --non-interactive
  • --db: postgres | mongodb | sqlite
  • --auth: none | jwt | oauth2
  • --tasks: none | celery+redis
  • --ai: none | openai | langchain | huggingface
  • --vector-db: none | pgvector | weaviate
  • --docker: cpu | gpu
  • --monitoring: none | sentry | prometheus
  • --frontend: none | htmx+vite
  • --output-dir <path>

Generated Project Structure

<project_slug>/
|- app/
|  |- core/
|  |  |- config.py
|  |  |- deps.py
|  |  `- security.py
|  |- api/v1/endpoints/
|  |  |- health.py
|  |  `- ai/ (optional)
|  |- models/
|  |- schemas/
|  `- crud/
|- tests/
|- migrations/ (postgres)
|- docker-compose.yml
|- Dockerfile
|- .env.example
|- pyproject.toml
`- .github/workflows/ci.yml

AI Capabilities

  • OpenAI chat endpoint template with SSE token streaming
  • Embeddings endpoint starter template
  • Hugging Face sentence-transformers friendly defaults
  • LangChain-ready integration stubs

Dev Experience

  • Consistent settings loading through pydantic-settings
  • Clear separation of API, core, schemas, and crud layers
  • Ready-to-run local development setup
  • Starter tests for smoke validation

Local Development

poetry install
poetry run pytest
poetry run fastai-stack --help

Running a Generated App

Inside the generated project:

cp .env.example .env
poetry install
poetry run uvicorn app.main:app --reload

Then open the interactive API docs at http://127.0.0.1:8000/docs.

Docker Usage

docker compose up --build

If you choose --docker gpu, the generated compose config includes NVIDIA device reservations.
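The reservation follows Compose's standard GPU syntax, roughly like the sketch below (the generated file's service names and counts may differ):

```yaml
services:
  api:
    build: .
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
```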

Publishing This CLI

poetry build
poetry publish

Roadmap Direction

  • Richer auth blueprints (fastapi-users full wiring)
  • Production-ready DB model templates
  • More AI provider templates and eval-ready scaffolds
  • Full release automation for PyPI + GitHub Releases

License

MIT
