Project description

pytorch-mighty

The Mighty Monitor Trainer for your PyTorch models. Powered by Visdom.

Documentation: https://pytorch-mighty.readthedocs.io/en/latest/

Installation

Requires Python 3.6+

  1. Install PyTorch:
    • CPU backend: conda install pytorch torchvision cpuonly -c pytorch
    • GPU backend: conda install pytorch torchvision cudatoolkit=10.2 -c pytorch
  2. $ pip install pytorch-mighty
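
As a quick sanity check (not part of the official install steps), you can confirm that PyTorch and pytorch-mighty import from the environment you just installed into:

import torch
import mighty  # pytorch-mighty installs the top-level package `mighty`

print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())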

Quick start

Before running any script, start Visdom server:

$ python -m visdom.server -port 8097
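
If you want to make sure the server is reachable before launching a training run, a short check with the visdom client (part of the visdom package itself, not pytorch-mighty) is enough:

import visdom

# Connects to the server started above; check_connection() returns False if it is down
viz = visdom.Visdom(port=8097)
assert viz.check_connection(), "Visdom server is not running on port 8097"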

Then run python examples.py or use the code below:

import torch
import torch.nn as nn
from torchvision import transforms
from torchvision.datasets import MNIST

from mighty.models import MLP
from mighty.monitor.monitor import MonitorLevel
from mighty.trainer import TrainerGrad
from mighty.utils.data import DataLoader

# fully connected classifier: 784 flattened MNIST pixels -> 128 hidden units -> 10 classes
model = MLP(784, 128, 10)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer)

# mighty's DataLoader wraps a torchvision dataset class (MNIST here)
data_loader = DataLoader(MNIST, transform=transforms.ToTensor())

trainer = TrainerGrad(model,
                      criterion=nn.CrossEntropyLoss(),
                      data_loader=data_loader,
                      optimizer=optimizer,
                      scheduler=scheduler)
# trainer.restore()  # uncomment to restore the saved state
# raise the monitoring verbosity to include signal-to-noise statistics
trainer.monitor.advanced_monitoring(level=MonitorLevel.SIGNAL_TO_NOISE)
trainer.train(n_epochs=10, mutual_info_layers=0)

Finally, navigate to http://localhost:8097 to see the training progress.
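
TrainerGrad is not tied to the MLP above: any nn.Module whose outputs fit the chosen criterion can be plugged in. Below is a minimal sketch that swaps in a small convolutional network, assuming the DataLoader delivers MNIST batches as 1x28x28 image tensors; the architecture is illustrative and not shipped with the package.

import torch
import torch.nn as nn
from torchvision import transforms
from torchvision.datasets import MNIST

from mighty.trainer import TrainerGrad
from mighty.utils.data import DataLoader

# Illustrative CNN for 1x28x28 MNIST images (not part of pytorch-mighty)
conv_net = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),              # -> 16 x 14 x 14
    nn.Flatten(),
    nn.Linear(16 * 14 * 14, 10),  # 10 MNIST classes
)

optimizer = torch.optim.Adam(conv_net.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer)
data_loader = DataLoader(MNIST, transform=transforms.ToTensor())

trainer = TrainerGrad(conv_net,
                      criterion=nn.CrossEntropyLoss(),
                      data_loader=data_loader,
                      optimizer=optimizer,
                      scheduler=scheduler)
trainer.train(n_epochs=10, mutual_info_layers=0)

The monitoring dashboard at http://localhost:8097 should behave the same way regardless of which model you pass in.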

Articles implemented or reused in the package

  1. Fong, R. C., & Vedaldi, A. (2017). Interpretable explanations of black boxes by meaningful perturbation.

  2. Belghazi, M. I., Baratin, A., Rajeswar, S., Ozair, S., Bengio, Y., Courville, A., & Hjelm, R. D. (2018). MINE: Mutual information neural estimation.

  3. Kraskov, A., Stögbauer, H., & Grassberger, P. (2004). Estimating mutual information.

  4. Ince, R. A., Giordano, B. L., Kayser, C., Rousselet, G. A., Gross, J., & Schyns, P. G. (2017). A statistical framework for neuroimaging data analysis based on mutual information estimated via a Gaussian copula. Human Brain Mapping, 38(3), 1541-1573.

  5. The IDTxl package, used to estimate mutual information.

Projects that use pytorch-mighty

  • MCMC_BinaryNet - Markov Chain Monte Carlo optimization of binary networks.
  • EmbedderSDR - encode images into binary Sparse Distributed Representation (SDR).
  • sparse-representation - Basis Pursuit solvers for the P0- and P1-problems, which encode the data into sparse vectors of high dimensionality.
  • entropy-estimators - estimate Entropy and Mutual Information between multivariate random variables.

Check out more examples at http://85.217.171.57:8097. Give your browser a few minutes to parse the JSON data.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pytorch-mighty-0.3.0.tar.gz (65.7 kB)

File details

Details for the file pytorch-mighty-0.3.0.tar.gz.

File metadata

  • Download URL: pytorch-mighty-0.3.0.tar.gz
  • Upload date:
  • Size: 65.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.1.3.post20200330 requests-toolbelt/0.9.1 tqdm/4.42.1 CPython/3.7.6

File hashes

Hashes for pytorch-mighty-0.3.0.tar.gz

  • SHA256: 7d11a64f306097d333cdc8df8837c74c5fbe5f2ceebaa49ed29dbd967df368a9
  • MD5: e383217ba68c82f5816b11300a89bed0
  • BLAKE2b-256: f2ca7561a8f726e073661a7b11faf5a09c92defb2204964ebdd115ebcc6735fc
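
If you download the archive manually, you can compare it against the SHA256 digest above with a few lines of standard-library Python (the path below assumes the file sits in the current directory):

import hashlib

expected = "7d11a64f306097d333cdc8df8837c74c5fbe5f2ceebaa49ed29dbd967df368a9"

# Hash the downloaded archive and compare against the published digest
with open("pytorch-mighty-0.3.0.tar.gz", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()

print("OK" if actual == expected else "SHA256 mismatch!")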

