Pytorch Mjolnir
A catalyst for experiments with lightning.
Thinking in experiments, made simpler for pytorch-lightning, so that no experiment is wasted and you can iterate faster.
Some links on PyPI are broken -> visit the project on GitHub instead.
Getting Started
Simply pip install
pip install pytorch_mjolnir
Then read the documentation of its API, which also contains examples.
Writing an Experiment
When you already have your code split into a model and a loss, it is easy to convert it into a mjolnir experiment. Simply use SupervisedExperiment or any of the other pre-built experiments. (Learn more in the documentation.)
from typing import Any, Tuple

from torch.optim import Adam
from torch.utils.data import random_split
from torchvision import transforms
from torchvision.datasets import MNIST

from pytorch_mjolnir import SupervisedExperiment


class MNISTExperiment(SupervisedExperiment):
    def __init__(self, learning_rate=1e-3, batch_size=32):
        super().__init__()
        self.save_hyperparameters()
        self.model = MyModel()  # Any old pytorch model -> preds = self.model(*features)
        self.loss = MyLoss()    # Any loss -> loss = self.loss(preds, targets)

    def prepare_data(self):
        # Prepare the data once (no state allowed due to multi-gpu/node setup).
        MNIST(".datasets", train=True, download=True)

    def load_data(self, stage=None) -> Tuple[Any, Any]:
        # Load your datasets.
        dataset = MNIST(".datasets", train=True, download=False, transform=transforms.ToTensor())
        return random_split(dataset, [55000, 5000])

    def configure_optimizers(self):
        # Create an optimizer to your liking.
        return Adam(self.parameters(), lr=self.hparams.learning_rate)


# Run the experiment when the script is executed.
if __name__ == "__main__":
    from pytorch_mjolnir import run
    run(MNISTExperiment)
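The MyModel and MyLoss placeholders above can be any regular PyTorch modules. As a minimal illustrative sketch (the architecture and the class names are assumptions for this example, not part of pytorch_mjolnir), an MNIST classifier and a matching loss wrapper could look like this:

```python
import torch
from torch import nn


class MyModel(nn.Module):
    """A minimal MNIST classifier (placeholder, for illustration only)."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),           # (N, 1, 28, 28) -> (N, 784)
            nn.Linear(28 * 28, 128),
            nn.ReLU(),
            nn.Linear(128, 10),     # 10 digit classes
        )

    def forward(self, x):
        return self.net(x)


class MyLoss(nn.Module):
    """Wraps cross-entropy so it matches the loss(preds, targets) call above."""

    def __init__(self):
        super().__init__()
        self.ce = nn.CrossEntropyLoss()

    def forward(self, preds, targets):
        return self.ce(preds, targets)
```

The experiment is then expected to call self.model(*features) to get predictions and self.loss(preds, targets) to get the training loss, as indicated by the comments in the snippet above.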
Running an Experiment
Local: Simply run your experiment's .py file from the command line. It has lots of parameters to customize its behaviour.
python examples/autoencoder.py --name=Autoencoder
Remote/SLURM: In a cluster setting, check out how the remote run command works; it might make you much more productive. You simply specify a run.template.slurm and a run.template.sh (see examples).
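The exact contents of these templates are documented in the examples, not here. As a rough sketch only (the partition name, paths, and the {name} placeholder below are assumptions, not confirmed pytorch_mjolnir conventions), a run.template.slurm is presumably an ordinary SLURM batch script that delegates to run.template.sh:

```bash
#!/bin/bash
#SBATCH --job-name={name}     # placeholder presumably filled in by mjolnir_remote (assumed)
#SBATCH --partition=gpu       # your cluster's partition name
#SBATCH --gres=gpu:1
#SBATCH --time=24:00:00
#SBATCH --output=logs/%j.out

# Delegate the actual environment setup and training call to the shell template.
srun bash run.template.sh
```

run.template.sh would then typically activate the environment and launch the experiment script.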
mjolnir_remote examples/autoencoder.py --name=Autoencoder --host=slurm.yourcompany.com
Contributing
Currently there are no guidelines on how to contribute, so the best thing you can do is open an issue and get in contact that way. In the issue we can discuss how you can implement your new feature or how to fix that nasty bug.
To contribute, please fork the repository on GitHub, then clone your fork. Make your changes and submit a pull request.
Origin of the Name
Mjolnir is Thor's hammer and a catalyst for lightning. As this library is about being a catalyst for experiments with (pytorch-)lightning, the name seemed fitting.
License
This repository is under MIT License. Please see the full license here.