
YAML-based automated rapid prototyping framework for deep learning experiments

Project description



Lighter makes PyTorch Lightning experiments reproducible and composable through YAML configuration. Stop hardcoding hyperparameters—configure everything from the command line.

Why Lighter?

You're already using PyTorch Lightning. But every experiment requires editing Python code to change hyperparameters:

# Want to try a different learning rate? Edit the code.
optimizer = Adam(params, lr=0.001)  # Change this line

# Want to use a different batch size? Edit the code.
train_loader = DataLoader(dataset, batch_size=32)  # And this one

# Want to train longer? Edit the code again.
trainer = Trainer(max_epochs=10)  # And this one too

With Lighter, configure everything in YAML and override from the CLI:

# Try different learning rates without touching code
lighter fit config.yaml model::optimizer::lr=0.001
lighter fit config.yaml model::optimizer::lr=0.01
lighter fit config.yaml model::optimizer::lr=0.1

# Every experiment is reproducible - just version control your configs

Quick Start

pip install lighter

Use your existing PyTorch Lightning code:

# model.py
import torch
import torch.nn.functional as F
import pytorch_lightning as pl

class MyModel(pl.LightningModule):
    def __init__(self, network, learning_rate=0.001):
        super().__init__()
        self.network = network
        self.lr = learning_rate

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = F.cross_entropy(self.network(x), y)
        self.log("train/loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.lr)

Configure in YAML instead of hardcoding:

# config.yaml
trainer:
  _target_: pytorch_lightning.Trainer
  max_epochs: 10

model:
  _target_: model.MyModel
  network:
    _target_: torchvision.models.resnet18
    num_classes: 10
  learning_rate: 0.001

data:
  _target_: lighter.LighterDataModule
  train_dataloader:
    _target_: torch.utils.data.DataLoader
    batch_size: 32
    dataset:
      _target_: torchvision.datasets.CIFAR10
      root: ./data
      train: true
      download: true

Run and iterate fast:

# Run your experiment
lighter fit config.yaml

# Try different hyperparameters - no code editing needed
lighter fit config.yaml model::learning_rate=0.01
lighter fit config.yaml trainer::max_epochs=50
lighter fit config.yaml data::train_dataloader::batch_size=64

# Use multiple GPUs
lighter fit config.yaml trainer::devices=4

# Every run creates timestamped outputs with saved configs
# outputs/2025-11-21/14-30-45/config.yaml  # Fully reproducible

Key Benefits

  • Reproducible: Every experiment = one YAML file. Version control configs like code (see the example after this list).
  • Fast iteration: Override any parameter from CLI without editing code.
  • Zero lock-in: Works with any PyTorch Lightning module. Your code, your logic.
  • Composable: Merge configs, create recipes, share experiments as files.
  • Organized: Automatic timestamped output directories with saved configs.
  • Simple: ~500 lines of code. Read the framework in 30 minutes.
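
Because every run writes its fully resolved config into the timestamped output directory (see the Quick Start above), any past experiment can be re-run or forked from that file. A minimal sketch, assuming the saved config.yaml is itself a valid config that can be passed back to lighter fit; the path is the illustrative one from the Quick Start:

# Reproduce a past run exactly from its saved config
lighter fit outputs/2025-11-21/14-30-45/config.yaml

# Fork it with a single override - still no code edits
lighter fit outputs/2025-11-21/14-30-45/config.yaml trainer::max_epochs=50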

Optional: Use LighterModule for Less Boilerplate

If you want automatic optimizer configuration and dual logging (step + epoch), use LighterModule:

from lighter import LighterModule

class MyModel(LighterModule):
    def training_step(self, batch, batch_idx):
        x, y = batch
        pred = self(x)
        loss = self.criterion(pred, y)

        if self.train_metrics:
            self.train_metrics(pred, y)

        return {"loss": loss}  # Framework logs automatically

    # validation_step, test_step, predict_step...

And configure the criterion, optimizer, and metrics in the YAML config:

model:
  _target_: model.MyModel
  network:
    _target_: torchvision.models.resnet18
    num_classes: 10
  criterion:
    _target_: torch.nn.CrossEntropyLoss
  optimizer:
    _target_: torch.optim.Adam
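    # "$@model::network.parameters()" references the network defined above and passes its parameters to the optimizer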
    params: "$@model::network.parameters()"
    lr: 0.001
  train_metrics:
    - _target_: torchmetrics.Accuracy
      task: multiclass
      num_classes: 10

LighterModule gives you:

  • Automatic configure_optimizers() handling
  • Automatic dual logging (step + epoch)
  • Config-driven criterion and metrics

But you still control:

  • All step implementations (see the validation-step sketch below)
  • Loss computation logic
  • When to call metrics
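
Because the step methods stay yours, a validation step is still plain PyTorch Lightning code. A minimal sketch, assuming a val_metrics attribute analogous to the train_metrics shown above (check the API docs for the exact attribute name):

from lighter import LighterModule

class MyModel(LighterModule):
    # training_step as shown above...

    def validation_step(self, batch, batch_idx):
        x, y = batch
        pred = self(x)
        loss = self.criterion(pred, y)  # criterion is injected from the YAML config

        # val_metrics is an assumed counterpart of train_metrics
        if getattr(self, "val_metrics", None) is not None:
            self.val_metrics(pred, y)

        return {"loss": loss}  # the framework logs the returned loss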

Example: Running a Hyperparameter Sweep

# Run grid search without editing code
for lr in 0.001 0.01 0.1; do
  for bs in 32 64 128; do
    lighter fit config.yaml \
      model::optimizer::lr=$lr \
      data::train_dataloader::batch_size=$bs
  done
done

# Each run saved in outputs/YYYY-MM-DD/HH-MM-SS/ with config.yaml
# Compare experiments by diffing configs

Documentation

Real-World Usage

Community

Citation

If Lighter helps your research, please cite our JOSS paper:

@article{lighter,
    doi = {10.21105/joss.08101},
    year = {2025},
    publisher = {The Open Journal},
    volume = {10},
    number = {111},
    pages = {8101},
    author = {Hadzic, Ibrahim and Pai, Suraj and Bressem, Keno and Foldyna, Borek and Aerts, Hugo JWL},
    title = {Lighter: Configuration-Driven Deep Learning},
    journal = {Journal of Open Source Software}
}

