
A tiny test suite for PyTorch-based machine learning models.

Project description

Tiny Torchtest

A tiny test suite for PyTorch-based machine learning models, inspired by mltest. Chase Roberts lists four basic tests in his Medium post about mltest. torchtest is largely a PyTorch port of mltest (which was written for TensorFlow).


Forked from BrainPugh, who forked the repo from suriyadeepan.

Notable changes:

  • Support for models with multiple positional arguments (see the sketch after this list).

  • Support for unsupervised learning.

  • Fewer requirements (thanks to streamlined testing).

  • More comprehensive testing.

  • This repository is still active. I've created an issue to double-check, but it looks like the original maintainer is no longer actioning pull requests.
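
As a purely hypothetical sketch of the multi-input support, the example below assumes the suite calls model(*inputs) when the input element of the batch is a tuple; the two-input model, the batch layout, and that unpacking convention are assumptions for illustration, not documented API.

import torch
import torch.nn as nn
import torch.nn.functional as F
from torchtest import assert_vars_change

# a hypothetical model whose forward takes two positional arguments
class TwoInputNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(30, 2)

    def forward(self, x1, x2):
        # concatenate both inputs before the linear layer
        return self.fc(torch.cat([x1, x2], dim=1))

model = TwoInputNet()
# assumption: a tuple of inputs is unpacked as model(*inputs)
batch = [(torch.randn(20, 20), torch.randn(20, 10)), torch.randint(0, 2, (20,))]
assert_vars_change(
    model=model,
    loss_fn=F.cross_entropy,
    optim=torch.optim.Adam(model.parameters()),
    batch=batch)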


Installation

pip install --upgrade tinytorchtest

Tests

# imports for examples
import torch
import torch.nn as nn
import torch.nn.functional as F

Variables Change

from torchtest import assert_vars_change

inputs = torch.randn(20, 20)
targets = torch.randint(0, 2, (20,))
batch = [inputs, targets]
model = nn.Linear(20, 2)

# what are the variables?
print('Our list of parameters:', [name for name, _ in model.named_parameters()])

# do they change after a training step?
#  let's run a train step and see
assert_vars_change(
    model=model,
    loss_fn=F.cross_entropy,
    optim=torch.optim.Adam(model.parameters()),
    batch=batch)
""" FAILURE """
# let's try to break this, so the test fails
params_to_train = [param for name, param in model.named_parameters() if name != 'bias']
# run test now
assert_vars_change(
    model=model,
    loss_fn=F.cross_entropy,
    optim=torch.optim.Adam(params_to_train),
    batch=batch)

# YES! the test fails because bias did not change
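
If you want a script to keep running past this deliberate failure, wrap the call in try/except. This is a minimal sketch; the suite signals failure by raising, and the exact exception class may vary between versions, so a broad except is used here.

# catch the deliberate failure so the rest of the script keeps running
try:
    assert_vars_change(
        model=model,
        loss_fn=F.cross_entropy,
        optim=torch.optim.Adam(params_to_train),
        batch=batch)
except Exception as err:  # the exact exception class depends on the version
    print('Test failed as expected:', err)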

Variables Don't Change

from torchtest import assert_vars_same

# What if bias is not supposed to change, by design?
#  test to see if bias remains the same after training
assert_vars_same(
    model=model,
    loss_fn=F.cross_entropy,
    optim=torch.optim.Adam(params_to_train),
    batch=batch,
    params=[('bias', model.bias)]
    )
# it does? good. let's move on

Output Range

from torchtest import test_suite

# NOTE: bias is fixed (not trainable)
optim = torch.optim.Adam(params_to_train)
loss_fn = F.cross_entropy

test_suite(model, loss_fn, optim, batch,
    output_range=(-2, 2),
    test_output_range=True
    )

# seems to work
""" FAILURE """
#  let's tweak the model to fail the test
model.bias = nn.Parameter(2 + torch.randn(2))

test_suite(
    model,
    loss_fn, optim, batch,
    output_range=(-1, 1),
    test_output_range=True
    )

# as expected, it fails; yay!

NaN Tensors

""" FAILURE """
model.bias = nn.Parameter(float('nan') * torch.randn(2))

test_suite(
    model,
    loss_fn, optim, batch,
    test_nan_vals=True
    )

Inf Tensors

""" FAILURE """
model.bias = nn.Parameter(float('inf') * torch.randn(2))

test_suite(
    model,
    loss_fn, optim, batch,
    test_inf_vals=True
    )
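
Under the hood, checks like these amount to elementwise scans with torch.isnan and torch.isinf; the library's internal implementation may differ, but you can reproduce the idea by hand:

# reproduce the idea by hand: scan parameters for NaN/Inf entries
for name, param in model.named_parameters():
    assert not torch.isnan(param).any(), f'{name} contains NaN values'
    assert not torch.isinf(param).any(), f'{name} contains Inf values'
# with the Inf bias set above, the second assertion fires for 'bias'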

Debugging

torchtest\torchtest.py", line 151, in _var_change_helper
    assert not torch.equal(p0, p1)
RuntimeError: Expected object of backend CPU but got backend CUDA for argument #2 'other'

When you are using a GPU, you should explicitly pass device='cuda:0'. By default, device is set to 'cpu'. See issue #1 for more information.

test_suite(
    model,  # a model moved to GPU
    loss_fn, optim, batch,
    test_inf_vals=True,
    device='cuda:0'
    )
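
Note that the device argument only tells the test suite where to run the training step; moving the model to the GPU is still your job. A minimal sketch that also works on CPU-only machines:

# pick the GPU when available, and move the model there before testing
device = 'cuda:0' if torch.cuda.is_available() else 'cpu'
model = model.to(device)

test_suite(
    model,
    loss_fn, optim, batch,
    test_inf_vals=True,
    device=device
    )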

Citation

@misc{Ram2019,
  author = {Suriyadeepan Ramamoorthy},
  title = {torchtest},
  year = {2019},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/suriyadeepan/torchtest}},
  commit = {42ba442e54e5117de80f761a796fba3589f9b223}
}



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

tinytorchtest-0.7.1.tar.gz (18.8 kB)

Uploaded Source

Built Distribution

tinytorchtest-0.7.1-py3-none-any.whl (18.5 kB)

Uploaded Python 3

File details

Details for the file tinytorchtest-0.7.1.tar.gz.

File metadata

  • Download URL: tinytorchtest-0.7.1.tar.gz
  • Upload date:
  • Size: 18.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.12 CPython/3.9.10 Linux/5.15.32-gentoo-dist

File hashes

Hashes for tinytorchtest-0.7.1.tar.gz
  • SHA256: c3b70b51ef52993f05a48adf9292e7197e46f5e1f53978e7836e2526284d70af
  • MD5: c146909ef22f557fa5eea445bd7f4d7f
  • BLAKE2b-256: 80228f2757ed78dcbd066417193d59395df34f033227df9bdc6155d07ca579fd

See more details on using hashes here.
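
For example, you can check a downloaded archive against the SHA256 digest above using Python's hashlib; a minimal sketch:

import hashlib

# compute the local SHA256 digest and compare it with the published one
with open('tinytorchtest-0.7.1.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()
assert digest == 'c3b70b51ef52993f05a48adf9292e7197e46f5e1f53978e7836e2526284d70af'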

File details

Details for the file tinytorchtest-0.7.1-py3-none-any.whl.

File metadata

  • Download URL: tinytorchtest-0.7.1-py3-none-any.whl
  • Upload date:
  • Size: 18.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.12 CPython/3.9.10 Linux/5.15.32-gentoo-dist

File hashes

Hashes for tinytorchtest-0.7.1-py3-none-any.whl
  • SHA256: bc4085c7442204c78e57271781ded5118a855eda1e2a8a03cf3bf5daa756209b
  • MD5: 46b6801a0840ec6e0948862262ac9c53
  • BLAKE2b-256: 800525c43d3ba223aeb4486b3bdb756e340fb270d2411f7d116b3e3133c35312

See more details on using hashes here.
