Project description
LighTorch
A PyTorch- and Lightning-based framework for research and ML-pipeline automation.
Framework
The automated pipeline runs in four stages:
- Hyperparameter space definition.
- Genetic algorithms (single-objective or multi-objective) search that space.
- The best hyperparameters are saved to config.yaml.
- A training session runs from that config.
htuning.py
from typing import Dict

from lightorch.htuning.optuna import htuning
from ... import NormalModule  # your LightningDataModule
from ... import FourierVAE    # your model

def objective(trial) -> Dict[str, float]:
    ... # define hyperparameters
    return hyperparameters

if __name__ == '__main__':
    htuning(
        model_class = FourierVAE,
        hparam_objective = objective,
        datamodule = NormalModule,
        valid_metrics = [f"Training/{name}" for name in [
            "Pixel",
            "Perceptual",
            "Style",
            "Total variance",
            "KL Divergence"]],
        directions = ['minimize', 'minimize', 'minimize', 'minimize', 'minimize'],
        precision = 'medium',
        n_trials = 150,
    )
exec: python3 -m htuning
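For illustration, the objective can sample the search space with Optuna's trial.suggest_* API and return the sampled values keyed by the model's constructor arguments. A minimal sketch, reusing hyperparameter names from the config.yaml below; adapt the names and ranges to your own model:

from typing import Dict
import optuna

def objective(trial: optuna.Trial) -> Dict[str, float]:
    # Hypothetical search space; keys must match the model's __init__ kwargs.
    return {
        "encoder_lr": trial.suggest_float("encoder_lr", 1e-4, 1e-1, log=True),
        "decoder_lr": trial.suggest_float("decoder_lr", 1e-4, 1e-1, log=True),
        "beta": trial.suggest_float("beta", 1e-6, 1e-3, log=True),
    }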
config.yaml
trainer: # trainer arguments
  logger: true
  enable_checkpointing: true
  max_epochs: 250
  accelerator: cuda
  devices: 1
  precision: 32

model:
  class_path: utils.FourierVAE # dotted import path to the model
  dict_kwargs: # **hparams
    encoder_lr: 2e-2
    encoder_wd: 0
    decoder_lr: 1e-2
    decoder_wd: 0
    alpha:
      - 0.02
      - 0.003
      - 0.003
      - 0.01
    beta: 0.00001
    optimizer: adam

data: # dataset arguments
  class_path: data.DataModule
  init_args:
    type_dataset: mnist
    batch_size: 12
    pin_memory: true
    num_workers: 8
training.py
from lightorch.training.cli import trainer

if __name__ == '__main__':
    trainer()
exec: python3 -m training -c config.yaml
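The config follows Lightning CLI conventions (class_path, init_args, dict_kwargs), so the run above corresponds roughly to the manual wiring below. This is a sketch under that assumption, using the import paths from config.yaml, not LighTorch's internals:

from lightning.pytorch import Trainer
from utils import FourierVAE   # model: class_path
from data import DataModule    # data: class_path

# Instantiate the model with the tuned hyperparameters (dict_kwargs).
model = FourierVAE(
    encoder_lr=2e-2, encoder_wd=0,
    decoder_lr=1e-2, decoder_wd=0,
    alpha=[0.02, 0.003, 0.003, 0.01],
    beta=1e-5,
    optimizer="adam",
)
datamodule = DataModule(
    type_dataset="mnist", batch_size=12, pin_memory=True, num_workers=8
)

# Trainer arguments mirror the trainer: block of the config.
trainer = Trainer(
    logger=True, enable_checkpointing=True, max_epochs=250,
    accelerator="cuda", devices=1, precision=32,
)
trainer.fit(model, datamodule=datamodule)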
Features
- Built-in Module class for:
  - Adversarial training.
  - Supervised and self-supervised training.
- Multi-objective and single-objective optimization and hyperparameter tuning with Optuna.
Modules
- Fourier Convolution.
- Fourier Deconvolution.
- Partial Convolution (optimized implementation).
- Grouped Query Attention, Multi Query Attention, Multi Head Attention (interpretability-oriented usage, with an optional flash-attention backend).
- Self Attention, Cross Attention.
- Normalization methods.
- Positional encoding methods.
- Embedding methods.
- Useful criterions.
- Useful utilities.
- Built-in default feed-forward networks.
- Adaptations of modules to the complex domain $\mathbb{C}$.
- Interpretable Deep Neural Networks.
- Monte Carlo forward methods (see the sketch after this list).
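On the last item: Monte Carlo forward methods keep stochastic layers such as dropout active at inference and aggregate repeated forward passes into a predictive mean and an uncertainty estimate. A minimal plain-PyTorch sketch of the idea (MC dropout), shown for illustration only and not as LighTorch's own API:

import torch
from torch import nn

def mc_forward(model: nn.Module, x: torch.Tensor, n_samples: int = 32):
    # Keep dropout (and other train-mode stochasticity) active at inference.
    model.train()
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    model.eval()
    # Predictive mean and a simple epistemic-uncertainty proxy.
    return samples.mean(dim=0), samples.std(dim=0)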
Contact
Citation
@misc{lightorch,
  author = {Jorge Enciso},
  title = {LighTorch: Automated Deep Learning framework for researchers},
  howpublished = {\url{https://github.com/Jorgedavyd/LighTorch}},
  year = {2024}
}
Download files
Download the file for your platform.
Source Distribution
lightorch-0.0.6.tar.gz (30.1 kB, details below)
Built Distribution
lightorch-0.0.6-py3-none-any.whl (36.3 kB, details below)
File details
Details for the file lightorch-0.0.6.tar.gz.
File metadata
- Download URL: lightorch-0.0.6.tar.gz
- Upload date:
- Size: 30.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | c4f23f03375db4b4b9f711fe22e82e56a701da6e70a36e64162fec254815b533
MD5 | dfec048caeaf4c0932b3452f78b13aed
BLAKE2b-256 | 48a8447787f9e44e2638942d06352fba64c9f456e2dc61c1664aaa914e185ad9
File details
Details for the file lightorch-0.0.6-py3-none-any.whl.
File metadata
- Download URL: lightorch-0.0.6-py3-none-any.whl
- Upload date:
- Size: 36.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 5867b534a9ff808a84d4a60852e07f4c10a74721e834a1c14802b32bbbc45729
MD5 | 35266e79ca15d684a453561842e42130
BLAKE2b-256 | 61ff90187056805469ec4299c4b2da7cdab2f275540eca9c28761e54702f2b25