
preflight

Pre-flight checks for PyTorch pipelines. Catch silent failures before they waste your GPU.



Most deep learning bugs don't crash your training loop — they silently produce a garbage model. NaNs in your data, labels leaking between train and val, wrong channel ordering, dead gradients. You won't know until hours later, after the GPU bill has landed.

preflight is a pre-training validation tool you run in 30 seconds before starting any training job. It's not a linter. It's a pre-flight check — the kind pilots run before the expensive thing takes off.


Install

pip install preflight-ml

Quickstart

Create a small Python file that exposes your dataloader:

# my_dataloader.py
import torch
from torch.utils.data import DataLoader, TensorDataset

x = torch.randn(200, 3, 224, 224)
y = torch.randint(0, 10, (200,))
dataloader = DataLoader(TensorDataset(x, y), batch_size=32)

Run preflight:

preflight run --dataloader my_dataloader.py

Output:

preflight — pre-training check report
╭────────────────────────┬──────────┬────────┬──────────────────────────────────────────────────╮
│ Check                  │ Severity │ Status │ Message                                          │
├────────────────────────┼──────────┼────────┼──────────────────────────────────────────────────┤
│ nan_inf_detection      │ FATAL    │ PASS   │ No NaN or Inf values found in 10 sampled batches │
│ normalisation_sanity   │ WARN     │ PASS   │ Normalisation looks reasonable (mean=0.001)      │
│ channel_ordering       │ WARN     │ PASS   │ Channel ordering looks correct (NCHW)            │
│ label_leakage          │ FATAL    │ PASS   │ No val_dataloader provided — skipped             │
│ split_sizes            │ INFO     │ PASS   │ train=200 samples                                │
│ vram_estimation        │ WARN     │ INFO   │ No CUDA GPU detected — skipped                   │
│ class_imbalance        │ WARN     │ PASS   │ Class distribution looks balanced                │
│ shape_mismatch         │ FATAL    │ PASS   │ No model provided — skipped                      │
│ gradient_check         │ FATAL    │ PASS   │ No model+loss provided — skipped                 │
╰────────────────────────┴──────────┴────────┴──────────────────────────────────────────────────╯

  0 fatal  0 warnings  9 passed

Pre-flight passed. Safe to start training.

Checks

preflight runs 10 checks across three severity tiers. A FATAL failure exits with code 1 and blocks CI.

| Check                | Severity | What it catches                                        |
|----------------------|----------|--------------------------------------------------------|
| nan_inf_detection    | FATAL    | NaN or Inf values anywhere in sampled batches          |
| label_leakage        | FATAL    | Samples appearing in both train and val sets           |
| shape_mismatch       | FATAL    | Dataset output shape incompatible with model input     |
| gradient_check       | FATAL    | Zero gradients, dead layers, exploding gradients       |
| normalisation_sanity | WARN     | Data that looks unnormalised (raw pixel values etc.)   |
| channel_ordering     | WARN     | NHWC tensors when PyTorch expects NCHW                 |
| vram_estimation      | WARN     | Estimated peak VRAM exceeds 90% of GPU memory          |
| class_imbalance      | WARN     | Severe class imbalance beyond a configurable threshold |
| split_sizes          | INFO     | Empty or degenerate train/val splits                   |
| duplicate_samples    | INFO     | Identical samples within a split                       |

With a model

Pass a model file to enable shape, gradient, and VRAM checks:

# my_model.py
import torch.nn as nn

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 224 * 224, 10))

# my_loss.py
import torch.nn as nn

loss_fn = nn.CrossEntropyLoss()

Then run:

preflight run \
  --dataloader my_dataloader.py \
  --model my_model.py \
  --loss my_loss.py \
  --val-dataloader my_val_dataloader.py

Configuration

Add a .preflight.toml to your repo root to configure thresholds and disable checks:

[thresholds]
imbalance_threshold = 0.05
nan_sample_batches = 20

[checks]
vram_estimation = false

[ignore]
# check = "class_imbalance"
# reason = "intentional: rare event dataset"

CI integration

Add to your GitHub Actions workflow:

- name: Install preflight
  run: pip install preflight-ml

- name: Run pre-flight checks
  run: preflight run --dataloader scripts/dataloader.py --format json

The --format json flag outputs machine-readable results. Exit code is 1 if any FATAL check fails, 0 otherwise.

List all checks

preflight checks

What preflight does NOT do

  • It does not replace unit tests. Use pytest for code logic.
  • It does not guarantee a correct model. Passing preflight is a minimum safety bar, not a certification.
  • It does not run your full training loop. Use it as a gate before training starts.
  • It does not modify your code. (A --fix flag is planned; see the roadmap.)

Roadmap

  • --fix flag — auto-patch common issues (channel ordering, normalisation)
  • Dataset snapshot + drift detection (preflight diff baseline.json new_data.pt)
  • Full dry-run mode (one batch through model + loss + backward)
  • Jupyter magic command (%load_ext preflight)
  • preflight-monai plugin for medical imaging checks
  • preflight-sktime plugin for time series checks

Contributing

See CONTRIBUTING.md. New checks are welcome — each one needs a passing test, a failing test, and a fix hint.

License

MIT — see LICENSE.
