
A pluggable & extensible trainer for PyTorch

Project description

Torchero - A training framework for PyTorch

Features

  • Train/validate models for a given number of epochs
  • Hooks/callbacks to add custom behavior
  • Multiple metrics for model accuracy/error
  • Training/validation statistics monitors
  • Cross-fold validation iterators for splitting validation data from training data

Example

Training with MNIST

import torch
from torch import nn
from torch.utils.data import DataLoader
from torch import optim
import torchvision
from torchvision.datasets import MNIST
from torchvision import transforms
import torchero
from torchero import SupervisedTrainer
from torchero.meters import CategoricalAccuracy
from torchero.callbacks import ProgbarLogger as Logger, CSVLogger

class Network(nn.Module):
    def __init__(self):
        super(Network, self).__init__()
        self.filter = nn.Sequential(nn.Conv2d(in_channels=1, out_channels=32, kernel_size=5),
                                    nn.ReLU(inplace=True),
                                    nn.BatchNorm2d(32),
                                    nn.MaxPool2d(2),
                                    nn.Conv2d(in_channels=32, out_channels=64, kernel_size=3),
                                    nn.ReLU(inplace=True),
                                    nn.BatchNorm2d(64),
                                    nn.MaxPool2d(2))
        self.linear = nn.Sequential(nn.Linear(5*5*64, 500),  # 64 feature maps of 5x5 after the two conv/pool blocks
                                    nn.BatchNorm1d(500),
                                    nn.ReLU(inplace=True),
                                    nn.Linear(500, 10))

    def forward(self, x):
        bs = x.shape[0]
        return self.linear(self.filter(x).view(bs, -1))

train_ds = MNIST(root='data/',
                 download=True,
                 train=True,
                 transform=transforms.Compose([transforms.ToTensor()]))
test_ds = MNIST(root='data/',
                download=False,
                train=False,
                transform=transforms.Compose([transforms.ToTensor()]))
train_dl = DataLoader(train_ds, batch_size=50)
test_dl = DataLoader(test_ds, batch_size=50)

model = Network()

trainer = SupervisedTrainer(model=model,
                            optimizer='sgd',
                            criterion='cross_entropy',
                            acc_meters={'acc': 'categorical_accuracy_percentage'},
                            callbacks=[Logger(),
                                       CSVLogger(output='training_stats.csv')
                                      ])

# If you want to use CUDA, uncomment the next line
# trainer.cuda()

trainer.train(dataloader=train_dl,
              valid_dataloader=test_dl,
              epochs=2)

Trainers

  • BatchTrainer: Abstract base class for all trainers that work with batched inputs
  • SupervisedTrainer: Trainer for supervised tasks
  • AutoencoderTrainer: Trainer for autoencoder tasks (see the sketch below)
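
The MNIST example above uses SupervisedTrainer. Below is a minimal sketch of the autoencoder case, assuming AutoencoderTrainer mirrors the SupervisedTrainer constructor and train() signature; the exact API is not confirmed by this page.

# Hypothetical sketch: assumes AutoencoderTrainer takes the same arguments
# as SupervisedTrainer and trains on inputs only (reconstruction loss)
from torch import nn
from torchero import AutoencoderTrainer

autoencoder = nn.Sequential(nn.Linear(784, 64),
                            nn.ReLU(inplace=True),
                            nn.Linear(64, 784))

trainer = AutoencoderTrainer(model=autoencoder,
                             optimizer='sgd',
                             criterion='mse')  # 'mse' as a criterion name is an assumption

# unsupervised_dl is a hypothetical DataLoader yielding flattened inputs only
trainer.train(dataloader=unsupervised_dl, epochs=2)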

Callbacks

  • callbacks.Callback: Base callback class for all epoch/training events
  • callbacks.History: Callback that records the history of all training/validation metrics
  • callbacks.Logger: Callback that displays metrics at each logging step
  • callbacks.ProgbarLogger: Callback that displays progress bars to monitor training/validation metrics
  • callbacks.CallbackContainer: Callback to group multiple hooks
  • callbacks.ModelCheckpoint: Callback to save the best model after every epoch
  • callbacks.EarlyStopping: Callback to stop training when a monitored quantity stops improving (see the sketch after this list)
  • callbacks.CSVLogger: Callback that exports training/validation statistics to a CSV file
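
Callbacks are passed to the trainer's callbacks list, as in the MNIST example. The sketch below is hedged: the ModelCheckpoint and EarlyStopping constructor arguments (path, monitor, patience) are assumptions, not taken from this page.

# Sketch only: the ModelCheckpoint/EarlyStopping arguments shown here are assumptions
from torchero import SupervisedTrainer
from torchero.callbacks import ProgbarLogger, CSVLogger, ModelCheckpoint, EarlyStopping

trainer = SupervisedTrainer(model=model,
                            optimizer='sgd',
                            criterion='cross_entropy',
                            acc_meters={'acc': 'categorical_accuracy_percentage'},
                            callbacks=[ProgbarLogger(),
                                       CSVLogger(output='training_stats.csv'),
                                       ModelCheckpoint(path='checkpoints/'),           # save the best model each epoch
                                       EarlyStopping(monitor='val_acc', patience=3)])  # stop when val_acc stops improving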

Meters

  • meters.BaseMeter: Interface for all meters
  • meters.BatchMeters: Superclass of meters that work with batches
  • meters.CategoricalAccuracy: Meter for accuracy on categorical targets
  • meters.BinaryAccuracy: Meter for accuracy on binary targets (assuming normalized inputs)
  • meters.BinaryAccuracyWithLogits: Binary accuracy meter with an integrated activation function (by default logistic function)
  • meters.ConfusionMatrix: Meter for confusion matrix
  • meters.MSE: Mean Squared Error meter
  • meters.MSLE: Mean Squared Log Error meter
  • meters.RMSE: Rooted Mean Squared Error meter
  • meters.RMSLE: Rooted Mean Squared Log Error meter
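
Meters can be referenced by name, as in the MNIST example ('categorical_accuracy_percentage'). The sketch below assumes meter instances can also be passed directly in acc_meters, as the CategoricalAccuracy import in the example suggests; this is an assumption about the API.

# Sketch: assumes meter instances are accepted in place of string names
from torchero import SupervisedTrainer
from torchero.meters import CategoricalAccuracy

trainer = SupervisedTrainer(model=model,
                            optimizer='sgd',
                            criterion='cross_entropy',
                            acc_meters={'acc': CategoricalAccuracy()})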

Cross validation

  • utils.data.CrossFoldValidation: Iterator over cross-fold validation folds
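
A hypothetical usage sketch follows; the constructor arguments (dataset, valid_size) and the values yielded per fold are assumptions, not confirmed by this page.

# Hypothetical sketch: argument names and per-fold return values are assumptions
from torch.utils.data import DataLoader
from torchero.utils.data import CrossFoldValidation

folds = CrossFoldValidation(dataset=train_ds, valid_size=0.2)  # e.g. 20% of the data held out per fold

for train_subset, valid_subset in folds:
    fold_train_dl = DataLoader(train_subset, batch_size=50)
    fold_valid_dl = DataLoader(valid_subset, batch_size=50)
    # train a fresh model/trainer on this fold's split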

Datasets

  • utils.data.datasets.SubsetDataset: Dataset that is a subset of the original dataset
  • utils.data.datasets.ShrinkDatset: Shrinks a dataset
  • utils.data.datasets.UnsuperviseDataset: Makes a dataset unsupervised
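
A hypothetical sketch of the dataset wrappers above; none of the constructor arguments shown are confirmed by this page.

# Hypothetical sketch: constructor arguments are assumptions
from torchero.utils.data.datasets import SubsetDataset, UnsuperviseDataset

small_ds = SubsetDataset(train_ds, indices=range(1000))  # hypothetical: keep only the first 1000 samples
unsup_ds = UnsuperviseDataset(train_ds)                   # hypothetical: drop targets, keep inputs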

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

torchero-0.0.2.tar.gz (34.0 kB)

Uploaded Source

Built Distribution

torchero-0.0.2-py3-none-any.whl (47.6 kB)

Uploaded Python 3

File details

Details for the file torchero-0.0.2.tar.gz.

File metadata

  • Download URL: torchero-0.0.2.tar.gz
  • Upload date:
  • Size: 34.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.1.3 requests-toolbelt/0.9.1 tqdm/4.45.0 CPython/3.8.2

File hashes

Hashes for torchero-0.0.2.tar.gz

  • SHA256: 9d0cab491c2c5784276bbcf80433c0671c8f9920711e7f6fd319d0335eb3d699
  • MD5: 50379b657e1fe5e94132ea8ad83ffda8
  • BLAKE2b-256: f15c37c9514b4b1f8febe6302886b8abde30c79f0d4117a132dd0613579a24d6


File details

Details for the file torchero-0.0.2-py3-none-any.whl.

File metadata

  • Download URL: torchero-0.0.2-py3-none-any.whl
  • Upload date:
  • Size: 47.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.1.3 requests-toolbelt/0.9.1 tqdm/4.45.0 CPython/3.8.2

File hashes

Hashes for torchero-0.0.2-py3-none-any.whl

  • SHA256: 17b1484bc8f9b74db730d272810fd26f30e31f7e33a4c0dd1ef739e848e1babe
  • MD5: 6877ca5338fc1aaacdda8b80ef69dfed
  • BLAKE2b-256: 323f3c8feaf60e7fc453094be8008c63e2f6df876587326c5b302f4ef6466de5

