
LighTorch

A PyTorch- and Lightning-based framework for research and ML-pipeline automation.

Framework

  1. Hyperparameter space definition.
  2. Genetic algorithms (single-objective/multi-objective).
  3. Best hyperparameters written to config.yaml.
  4. Training session.

htuning.py

from typing import Dict

from lightorch.htuning.optuna import htuning
from ... import NormalModule
from ... import FourierVAE

def objective(trial) -> Dict[str, float]:
    ... # define hyperparameters
    return hyperparameters

if __name__ == '__main__':
    htuning(
        model_class = FourierVAE,
        hparam_objective = objective,
        datamodule = NormalModule,
        valid_metrics = [f"Training/{name}" for name in [
            "Pixel",
            "Perceptual",
            "Style",
            "Total variance",
            "KL Divergence"]],
        directions = ['minimize', 'minimize', 'minimize', 'minimize', 'minimize'],
        precision = 'medium',
        n_trials = 150,
    )

exec: python3 -m htuning
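
For reference, a concrete objective might sample values with Optuna's trial.suggest_* API. The hyperparameter names and ranges below are illustrative assumptions (chosen to mirror the dict_kwargs in config.yaml), not part of LighTorch:

from typing import Dict

def objective(trial) -> Dict[str, float]:
    # Illustrative only: names and ranges are assumptions,
    # not a documented LighTorch search space.
    return {
        "encoder_lr": trial.suggest_float("encoder_lr", 1e-4, 1e-1, log=True),
        "decoder_lr": trial.suggest_float("decoder_lr", 1e-4, 1e-1, log=True),
        "beta": trial.suggest_float("beta", 1e-6, 1e-3, log=True),
    }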

config.yaml

trainer: # trainer arguments
  logger: true
  enable_checkpointing: true
  max_epochs: 250
  accelerator: cuda
  devices: 1
  precision: 32

model:
  class_path: utils.FourierVAE # model relative path
  dict_kwargs: # **hparams
    encoder_lr: 2e-2
    encoder_wd: 0
    decoder_lr: 1e-2
    decoder_wd: 0
    alpha:
      - 0.02
      - 0.003
      - 0.003
      - 0.01
    beta: 0.00001
    optimizer: adam

data: # Dataset arguments
  class_path: data.DataModule
  init_args:
    type_dataset: mnist 
    batch_size: 12
    pin_memory: true
    num_workers: 8
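
The data.DataModule referenced above is a standard LightningDataModule whose init_args map to its constructor. A minimal sketch, assuming MNIST via torchvision (the actual implementation is not shown on this page):

import pytorch_lightning as pl
from torch.utils.data import DataLoader
from torchvision import transforms
from torchvision.datasets import MNIST

class DataModule(pl.LightningDataModule):
    # Hypothetical implementation matching the init_args in config.yaml.
    def __init__(self, type_dataset: str, batch_size: int,
                 pin_memory: bool, num_workers: int):
        super().__init__()
        self.batch_size = batch_size
        self.pin_memory = pin_memory
        self.num_workers = num_workers

    def setup(self, stage=None):
        self.train_set = MNIST("./data", train=True, download=True,
                               transform=transforms.ToTensor())

    def train_dataloader(self):
        return DataLoader(self.train_set, batch_size=self.batch_size,
                          shuffle=True, pin_memory=self.pin_memory,
                          num_workers=self.num_workers)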

training.py

from lightorch.training.cli import trainer

if __name__ == '__main__':
    trainer()

exec: python3 -m training -c config.yaml

Features

  • Built-in Module class for:
    • Adversarial training.
    • Supervised and self-supervised training (a plain-Lightning sketch follows this list).
  • Multi-objective and single-objective optimization and hyperparameter tuning with Optuna.
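
In plain Lightning terms, the supervised case reduces to something like the sketch below. This is generic PyTorch Lightning, not LighTorch's actual Module API, which this page does not document:

import torch
import pytorch_lightning as pl

class SupervisedModule(pl.LightningModule):
    # Plain-Lightning sketch of supervised training; LighTorch's
    # Module class automates patterns like this.
    def __init__(self, model: torch.nn.Module, lr: float = 1e-3):
        super().__init__()
        self.model = model
        self.criterion = torch.nn.CrossEntropyLoss()
        self.lr = lr

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = self.criterion(self.model(x), y)
        self.log("Training/loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.lr)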

Modules

  • Fourier Convolution.
  • Fourier Deconvolution.
  • Partial Convolution. (Optimized implementation)
  • Grouped Query Attention, Multi Query Attention, Multi Head Attention. (Interpretative usage) (with flash-attention option)
  • Self Attention, Cross Attention.
  • Normalization methods.
  • Positional encoding methods.
  • Embedding methods.
  • Useful criterions.
  • Useful utilities.
  • Built-in Default Feed Forward Networks.
  • Adaptation for $\mathbb{C}$ modules.
  • Interpretative Deep Neural Networks.
  • Monte Carlo forward methods.
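
As a conceptual illustration of the Fourier-convolution idea (not LighTorch's implementation), convolution can be computed as pointwise multiplication in the frequency domain:

import torch

def fourier_conv2d(x: torch.Tensor, weight: torch.Tensor) -> torch.Tensor:
    # Circular convolution via the convolution theorem:
    # conv(x, w) = IFFT(FFT(x) * FFT(w)). Illustrative only.
    # x: (N, C, H, W); weight: (C, kH, kW), zero-padded to (H, W).
    H, W = x.shape[-2:]
    X = torch.fft.rfft2(x, s=(H, W))
    Wf = torch.fft.rfft2(weight, s=(H, W))
    return torch.fft.irfft2(X * Wf, s=(H, W))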

Citation

@misc{lightorch,
  author = {Jorge Enciso},
  title = {LighTorch: Automated Deep Learning framework for researchers},
  howpublished = {\url{https://github.com/Jorgedavyd/LighTorch}},
  year = {2024}
}
