Set up and train deep nets with PyTorch. Opinionated and Simple.

Project description

Configure and train deep feedforward PyTorch models with a lot of the details already or partially implemented.

DISCLAIMER: At the moment, this repo is used for my research. New versions are not necessarily backwards compatible, and the API is subject to change at a moment's notice. If you use it in your research or work, pin the version in your requirements.txt or reference the specific commit you used so you don't suffer unwanted surprises.
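
For example, a requirements.txt might pin the exact version published here, or point at a specific commit (the repository URL and commit hash below are placeholders):

# requirements.txt -- pin the exact PyPI release
simplepytorch==0.3.0
# ...or pin a specific commit (placeholder URL and hash):
# git+https://github.com/<user>/simplepytorch.git@<commit-hash>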

NOTE: This readme and the examples are a bit dated. They describe an approach based on the FeedForward class, but I have since decided that the inheritance design of the FeedForward class is too cumbersome for most projects. I now use this repo mainly for its datasets, logging tools, metrics, result plotting, and the other functions available in simplepytorch.api.

Motivation and useful features:

  • Clarity: Much research using PyTorch mixes tedious boilerplate code (argparse configuration, standard training-loop code, logging) with the actual contribution of your work (i.e., a new enhancement method, model, or training style). By design, this repo tries to force you, as a programmer, to better separate standard PyTorch code from your research contribution.
  • Simplicity from Command-line: All key parameters should be automatically exposed on the command-line. This library converts all public class variables in your Model Config class into an organized list of command-line arguments, enabling reproducible and highly configurable experiments (a sketch of this idea appears after this list).
  • Reproducibility: The logging infrastructure organizes all results, logs, and model checkpoints for a particular experiment, identified by its run_id, into a dedicated directory. All configuration for your model can be defined at the command-line.
  • Easy to get started: There can be a dizzying array of little details to implement when training a PyTorch model. Forgetting these details often leads to bugs and experiments with missing or incorrect results. The library (specifically the FeedForward class) gives a straightforward recipe and list of functions to implement.
  • Datasets: PyTorch Dataset implementations for data I use in my research. Mostly retinal fundus image datasets. You must download and unzip the datasets yourself. A download link is usually in the class docstring.
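
To make the "Simplicity from Command-line" bullet concrete: the idea is to reflect over a config class's public attributes and build an argument parser from them. Below is a minimal sketch of that pattern in plain argparse; it is illustrative only, not the library's actual implementation.

import argparse

class Config:
    # public class variables become command-line arguments
    epochs: int = 3
    learning_rate: float = 0.001
    run_id: str = 'experimentA'

def build_parser(cfg_cls):
    parser = argparse.ArgumentParser()
    for name, default in vars(cfg_cls).items():
        if name.startswith('_'):
            continue  # skip private and dunder attributes
        parser.add_argument('--%s' % name.replace('_', '-'),
                            default=default, type=type(default))
    return parser

args = build_parser(Config).parse_args(['--epochs', '5'])
print(args.epochs)  # 5, coerced to int from the class variable's type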

Install

pip install --upgrade simplepytorch

Quick Start

Preliminaries: get a dataset and set up a project.

#
# set up a project
#
# --> create a directory for your project
mkdir -p ./myproject/data
# --> copy the examples directory (from this repo)
cp -rf ./examples ./myproject/
# --> link your pre-trained torch models into ./data if you want.
ln -sr ~/.torch ./myproject/data/torch
# --> now go download the RITE dataset and unzip it into ./myproject/data/RITE
ls ./myproject/data/RITE
# ls output: AV_groundTruth.zip  introduction.txt  read_me.txt  test  training

cd ./myproject
# --> add the current directory to PYTHONPATH so Python can import ./examples as a package
export PYTHONPATH=.:$PYTHONPATH

Train the model from the command-line and get results

#
# train the model from command-line
#
simplepytorch ./examples/ -h
simplepytorch ./examples/ LetsTrainSomething -h
simplepytorch ./examples/ LetsTrainSomething --run-id experimentA --epochs 3
# alternative command-line ways to start code
run_id=experimentB epochs=3 simplepytorch ./examples/ LetsTrainSomething
simplepytorch ./examples/my_feedforward_model_config.py LetsTrainSomething

# --> debug your model with IPython
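# (note: `--epochs a` passes an invalid value on purpose, so an error is raised and there is something to debug)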
simplepytorch_debug ./examples/ LetsTrainSomething --run-id experimentA --epochs a
# --> now you can type %debug to drop into a PDB debugger.  Move around by typing `up` and `down`

# check the results
ls ./data/results/experimentA
tail -f ./data/results/experimentA/perf.csv 
# --> plot results for all experiments matching a regex
simplepytorch_plot 'experiment.*' --ns
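
The perf.csv files are plain CSV logs, so they are also easy to inspect programmatically. A small sketch, assuming pandas is installed and that perf.csv accumulates one row of metrics per epoch:

import pandas as pd

# load the metrics logged for one experiment
df = pd.read_csv('./data/results/experimentA/perf.csv')
print(df.tail())  # the most recently logged epochs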

Programmatic access via the simplepytorch API:

import examples
import simplepytorch.api as api

cfg = api.load_model_config(examples.LetsTrainSomething, '--epochs 1')
cfg.train()

Developing your own pytorch code

Check the examples directory for a simple getting started template. You can train a model to perform vessel segmentation on the RITE dataset in about 70 lines of code.

examples/

As a next step, you can copy the examples directory, rename it to whatever your project name is, and start from there. You will find, as mentioned in examples/my_feedforward_model_config.py, that the api.FeedForward class typically lists everything needed. Assuming you want to use the FeedForward class, just implement or override its methods (a heavily simplified sketch follows). If something isn't obvious or clear, create a GitHub issue and I will support you to the extent that I can.
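
The general shape of that workflow is a subclass that overrides a few hooks. The sketch below is illustrative only: the method name and body are placeholders, not the library's actual hook names, which are listed in examples/my_feedforward_model_config.py.

import torch
import simplepytorch.api as api

class MyConfig(api.FeedForward):
    # public class variables double as command-line arguments
    epochs = 3

    # hypothetical hook name, for illustration only; consult
    # examples/my_feedforward_model_config.py for the real methods
    def get_model(self):
        return torch.nn.Linear(10, 2)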

Datasets:

The library provides PyTorch Dataset implementations for datasets that do not already have an existing PyTorch implementation.

To use the pre-defined dataset classes, you must download and unzip the data yourself. Consult the dataset class's docstring for usage details.

import simplepytorch.datasets as D

dset = D.RITE(use_train_set=True)
dset[0]
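
Because these are ordinary PyTorch Dataset objects, they also compose with the standard DataLoader. A minimal sketch; the identity collate_fn is used because raw samples may be images rather than tensors, depending on the dataset and any transforms you configure:

from torch.utils.data import DataLoader
import simplepytorch.datasets as D

dset = D.RITE(use_train_set=True)
# identity collate keeps each batch as a plain list of samples
loader = DataLoader(dset, batch_size=4, shuffle=True,
                    collate_fn=lambda batch: batch)
for batch in loader:
    print(len(batch))  # 4 samples
    break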

For example, some downloaded datasets I use have the following structure:

 $ ls data/{arsn_qualdr,eyepacs,messidor,IDRiD_segmentation,RITE}
data/IDRiD_segmentation:
'1. Original Images'  '2. All Segmentation Groundtruths'   CC-BY-4.0.txt   LICENSE.txt

data/RITE:
AV_groundTruth.zip  introduction.txt  read_me.txt  test  training

data/arsn_qualdr:
README.md  annotations  annotations.zip  imgs1  imgs1.zip  imgs2  imgs2.zip

data/eyepacs:
README.md                 test          test.zip.003  test.zip.006  train.zip.001  train.zip.004
sample.zip                test.zip.001  test.zip.004  test.zip.007  train.zip.002  train.zip.005
sampleSubmission.csv.zip  test.zip.002  test.zip.005  train         train.zip.003  trainLabels.csv.zip

data/messidor:
Annotation_Base11.csv  Annotation_Base21.csv  Annotation_Base31.csv  Base11  Base21  Base31
Annotation_Base12.csv  Annotation_Base22.csv  Annotation_Base32.csv  Base12  Base22  Base32
Annotation_Base13.csv  Annotation_Base23.csv  Annotation_Base33.csv  Base13  Base23  Base33
Annotation_Base14.csv  Annotation_Base24.csv  Annotation_Base34.csv  Base14  Base24  Base34
