LighTorch
A PyTorch and Lightning based framework for research and ML pipeline automation.
Framework
- $\text{Hyperparameter space}$
- $\text{Genetic algorithms (single-objective / multi-objective)}$
- $\text{Best hyperparameters saved to config.yaml}$
- $\text{Training session}$
htuning.py
from typing import Dict

from lightorch.htuning.optuna import htuning
from ... import NormalModule
from ... import FourierVAE

def objective(trial) -> Dict[str, float]:
    ...  # define hyperparameters
    return hyperparameters

if __name__ == '__main__':
    htuning(
        model_class=FourierVAE,
        hparam_objective=objective,
        datamodule=NormalModule,
        valid_metrics=[f"Training/{name}" for name in [
            "Pixel",
            "Perceptual",
            "Style",
            "Total variance",
            "KL Divergence"]],
        directions=['minimize', 'minimize', 'minimize', 'minimize', 'minimize'],
        precision='medium',
        n_trials=150,
    )
exec: python3 -m htuning
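The objective body is elided above. A minimal sketch of what it could return, assuming Optuna's standard trial.suggest_* API; the hyperparameter names here are illustrative and mirror the config below:

from typing import Dict

import optuna

def objective(trial: optuna.Trial) -> Dict[str, float]:
    # Sample the search space; the returned dict is forwarded
    # to the model as its hyperparameters.
    return {
        "encoder_lr": trial.suggest_float("encoder_lr", 1e-4, 1e-1, log=True),
        "decoder_lr": trial.suggest_float("decoder_lr", 1e-4, 1e-1, log=True),
        "encoder_wd": trial.suggest_float("encoder_wd", 0.0, 1e-2),
        "decoder_wd": trial.suggest_float("decoder_wd", 0.0, 1e-2),
        "beta": trial.suggest_float("beta", 1e-6, 1e-3, log=True),
    }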
config.yaml
trainer: # trainer arguments
  logger: true
  enable_checkpointing: true
  max_epochs: 250
  accelerator: cuda
  devices: 1
  precision: 32

model:
  class_path: utils.FourierVAE # model relative path
  dict_kwargs: # **hparams
    encoder_lr: 2e-2
    encoder_wd: 0
    decoder_lr: 1e-2
    decoder_wd: 0
    alpha:
      - 0.02
      - 0.003
      - 0.003
      - 0.01
    beta: 0.00001
    optimizer: adam

data: # dataset arguments
  class_path: data.DataModule
  init_args:
    type_dataset: mnist
    batch_size: 12
    pin_memory: true
    num_workers: 8
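Assuming standard Lightning CLI semantics, where class_path is imported and dict_kwargs are forwarded as keyword arguments, the model section above resolves roughly to the following (illustrative, not part of the pipeline files):

from utils import FourierVAE  # resolved from class_path

model = FourierVAE(
    encoder_lr=2e-2,
    encoder_wd=0,
    decoder_lr=1e-2,
    decoder_wd=0,
    alpha=[0.02, 0.003, 0.003, 0.01],
    beta=1e-5,
    optimizer="adam",
)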
training.py
from lightorch.training.cli import trainer

if __name__ == '__main__':
    trainer()
exec: python3 -m training -c config.yaml
Features
- Built-in Module class for:
  - Adversarial training.
  - Supervised and self-supervised training.
- Multi-objective and single-objective optimization and hyperparameter tuning with Optuna (the logged metrics that tuning monitors are sketched below).
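The valid_metrics monitored during tuning are assumed to be the keys a module logs during training. A minimal placeholder sketch with plain PyTorch Lightning showing that contract (the model and loss terms are stand-ins, not LighTorch's API):

import pytorch_lightning as pl
import torch
from torch import nn

class PlaceholderVAE(pl.LightningModule):
    # Hypothetical skeleton: shows only the metric-logging contract.
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(1, 1)  # stand-in for a real encoder/decoder

    def training_step(self, batch, batch_idx):
        pred = self.net(torch.ones(1, 1, device=self.device))
        pixel = pred.abs().mean()  # placeholder loss terms
        kl = pred.pow(2).mean()
        self.log("Training/Pixel", pixel)       # keys match the
        self.log("Training/KL Divergence", kl)  # valid_metrics above
        return pixel + kl

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters())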
Modules
- Fourier Convolution (see the sketch after this list).
- Fourier Deconvolution.
- Partial Convolution (optimized implementation).
- Grouped-Query Attention, Multi-Query Attention, Multi-Head Attention (interpretative usage, with a flash-attention option).
- Self-Attention, Cross-Attention.
- Normalization methods.
- Positional encoding methods.
- Embedding methods.
- Useful criterions (loss functions).
- Useful utilities.
- Built-in default feed-forward networks.
- Adaptation for $\mathbb{C}$ (complex-valued) modules.
- Interpretable Deep Neural Networks.
- Monte Carlo forward methods.
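As a rough illustration of the first item (a generic sketch of convolution via the FFT, not LighTorch's actual implementation), a spectral layer can multiply the input's spectrum by learned complex weights using torch.fft:

import torch
from torch import nn

class SpectralConv1d(nn.Module):
    # Generic FFT-based convolution: pointwise multiplication of the
    # input spectrum by learned complex weights, then inverse FFT.
    def __init__(self, channels: int, n_freqs: int):
        super().__init__()
        self.weight = nn.Parameter(
            torch.randn(channels, n_freqs, dtype=torch.cfloat) * 0.02
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, length)
        spec = torch.fft.rfft(x, dim=-1)                  # to frequency domain
        spec = spec * self.weight[..., : spec.shape[-1]]  # learned filter
        return torch.fft.irfft(spec, n=x.shape[-1], dim=-1)  # back to signal

For example, SpectralConv1d(channels=8, n_freqs=65) maps a (2, 8, 128) tensor to a tensor of the same shape; n_freqs must cover the length // 2 + 1 frequencies produced by rfft.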
Contact
Citation
@misc{lightorch,
  author = {Jorge Enciso},
  title = {LighTorch: Automated Deep Learning framework for researchers},
  howpublished = {\url{https://github.com/Jorgedavyd/LighTorch}},
  year = {2024}
}