# nolearn-utils
[![Build Status](https://travis-ci.org/felixlaumon/nolearn_utils.svg?branch=master)](https://travis-ci.org/felixlaumon/nolearn_utils)
Iterators and handlers for `nolearn.lasagne` that enable efficient real-time image augmentation and training progress monitoring.
## Real-time image augmentation
- `ShuffleBatchIteratorMixin` to shuffle training samples
- `ReadImageBatchIteratorMixin` to load images from file paths, in color or grayscale, at a specified size
- `RandomFlipBatchIteratorMixin` to randomly (uniformly) flip images horizontally or vertically
- `AffineTransformBatchIteratorMixin` to apply affine transformations (scale, rotate, translate) to randomly selected images, drawn from the given transformation choices
- `BufferedBatchIteratorMixin` to perform the transformations in a separate thread automatically and put the results in a buffer (default size = 5)
- `LCNBatchIteratorMixin` to perform local contrast normalization on images
- `MeanSubtractBatchIteratorMixin` to subtract a pre-calculated mean from the samples
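The mixins compose through a chained `transform` method, following the `BatchIterator` pattern from `nolearn.lasagne`: each mixin applies its transformation and delegates to the next via `super()`. A minimal illustrative sketch of the pattern (hypothetical and simplified, not the library's exact code):

```python
import numpy as np


class BaseIterator(object):
    """Minimal stand-in for the base batch iterator's transform chain."""
    def transform(self, Xb, yb):
        return Xb, yb


class RandomFlipMixin(object):
    """Hypothetical mixin: flips each image horizontally with probability p."""
    def __init__(self, flip_horizontal_p=0.5, **kwargs):
        super(RandomFlipMixin, self).__init__(**kwargs)
        self.flip_horizontal_p = flip_horizontal_p

    def transform(self, Xb, yb):
        # Let the rest of the mixin chain run first
        Xb, yb = super(RandomFlipMixin, self).transform(Xb, yb)
        flip = np.random.rand(Xb.shape[0]) < self.flip_horizontal_p
        Xb = Xb.copy()
        Xb[flip] = Xb[flip, ..., ::-1]  # flip along the last (width) axis
        return Xb, yb


class FlipIterator(RandomFlipMixin, BaseIterator):
    pass
```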
Example of composing the iterator mixins (the imports and the example values for `batch_size` and `image_size` are added here for completeness):

```python
import numpy as np

from nolearn_utils.iterators import (
    make_iterator,
    ShuffleBatchIteratorMixin,
    ReadImageBatchIteratorMixin,
    RandomFlipBatchIteratorMixin,
    AffineTransformBatchIteratorMixin,
    BufferedBatchIteratorMixin,
)

batch_size = 128  # example values; adjust to your dataset
image_size = 32

train_iterator_mixins = [
    ShuffleBatchIteratorMixin,
    ReadImageBatchIteratorMixin,
    RandomFlipBatchIteratorMixin,
    AffineTransformBatchIteratorMixin,
    BufferedBatchIteratorMixin,
]
TrainIterator = make_iterator('TrainIterator', train_iterator_mixins)

train_iterator_kwargs = {
    'buffer_size': 5,
    'batch_size': batch_size,
    'read_image_size': (image_size, image_size),
    'read_image_as_gray': False,
    'read_image_prefix_path': './data/train/',
    'flip_horizontal_p': 0.5,
    'flip_vertical_p': 0,
    'affine_p': 0.5,
    'affine_scale_choices': np.linspace(0.75, 1.25, 5),
    'affine_translation_choices': np.arange(-3, 4, 1),
    'affine_rotation_choices': np.arange(-45, 50, 5),
}
train_iterator = TrainIterator(**train_iterator_kwargs)
```
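Under the hood, `make_iterator` presumably just assembles the mixins and the base iterator into a new class, so that each mixin's `transform` chains onward via `super()`. Conceptually (a sketch under that assumption, not the library's actual implementation):

```python
from nolearn_utils.iterators import BaseBatchIterator  # assumed location


def make_iterator_sketch(name, mixins):
    # Mixins precede the base class in the MRO, so each mixin's
    # transform() runs in turn and delegates onward via super()
    return type(name, tuple(mixins) + (BaseBatchIterator,), {})
```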
The `BaseBatchIterator` is also modified from the `nolearn.lasagne` version to display a progress bar for the training process at each iteration.
## Handlers
- `EarlyStopping` to stop training when the loss stops improving
- `StepDecay` to gradually reduce a parameter (e.g. learning rate) over time
- `SaveTrainingHistory` to save training history (e.g. training loss)
- `PlotTrainingHistory` to plot training loss and validation accuracy over time with matplotlib after each iteration
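Handlers plug into `nolearn.lasagne`'s `NeuralNet` as epoch callbacks. A hedged usage sketch (the module path `nolearn_utils.hooks` and the constructor arguments are assumptions; check the package for the exact signatures):

```python
from lasagne.layers import InputLayer, DenseLayer
from lasagne.nonlinearities import softmax
from nolearn.lasagne import NeuralNet
from nolearn_utils.hooks import (  # assumed module path
    EarlyStopping,
    StepDecay,
    SaveTrainingHistory,
    PlotTrainingHistory,
)

net = NeuralNet(
    layers=[
        (InputLayer, {'shape': (None, 784)}),
        (DenseLayer, {'num_units': 10, 'nonlinearity': softmax}),
    ],
    update_learning_rate=0.01,
    max_epochs=100,
    # Each handler is called after every epoch with the net and its history
    on_epoch_finished=[
        StepDecay('update_learning_rate', start=0.01, stop=0.0001),
        EarlyStopping(patience=100),
        SaveTrainingHistory('./training_history.pkl'),
        PlotTrainingHistory('./training_history.png'),
    ],
)
```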
## Examples
The example code requires `scikit-learn`.
### MNIST
`examples/mnist/train.py` should produce a model with about 99.5% accuracy in fewer than 50 epochs.
MNIST data can be downloaded from
[Kaggle](https://www.kaggle.com/c/digit-recognizer).
### CIFAR10
CIFAR10 images can be downloaded from [Kaggle](https://www.kaggle.com/c/cifar-10/data). Place the downloaded data as follows:
```
examples/cifar10
├── data
│   ├── train
│   │   ├── 1.png
│   │   ├── 2.png
│   │   ├── 3.png
│   │   ├── ...
│   └── trainLabels.csv
└── train.py
```
`examples/cifar10/train.py` should produce a model with about 85% accuracy at 100 epochs. Images are read from disk and augmented at training time (in a separate thread), as sketched below.
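With `read_image_prefix_path` set as in the iterator example above, `X` only needs to hold file names relative to that prefix. A hypothetical loading sketch for the layout above (assumes `trainLabels.csv` has `id` and `label` columns, as in the Kaggle download, and reuses `train_iterator` from the earlier example):

```python
import csv

import numpy as np

# Build X as file names relative to read_image_prefix_path ('./data/train/')
with open('./data/trainLabels.csv') as f:
    rows = list(csv.DictReader(f))

X = np.array(['%s.png' % row['id'] for row in rows])
label_names = sorted(set(row['label'] for row in rows))
y = np.array([label_names.index(row['label']) for row in rows], dtype=np.int32)

# The iterator reads, decodes and augments the images batch by batch
for Xb, yb in train_iterator(X, y):
    pass  # feed each augmented batch to training
```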
## TODO
- [ ] Embarrassingly parallelize transform
## License
MIT & BSD