
Implementation of random smooth gray value transformations (rsgt) for training gray value independent neural networks

Project description


Random Smooth Grayvalue Transformations

Convolutional neural networks trained for a detection or segmentation task on a specific type of medical gray value image, such as CT or MR images, typically fail on other medical gray value images, even if the target structure looks similar in both types of images. Random smooth gray value transformations are a data augmentation technique aimed at forcing the network to become gray value invariant. During training, the gray values of the training images or patches are randomly changed, but using a smooth and continuous transfer function so that shape and texture information is largely retained.
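The core idea can be sketched in plain NumPy. The helper below is an illustrative sketch under stated assumptions, not the rsgt implementation: it draws a few random control points, interpolates smoothly between them with cosine easing to obtain a transfer function (a look-up table over the gray value range), and applies it to an integer image.

```python
import numpy as np

def random_smooth_lut(min_val, max_val, n_points=5, seed=None):
    """Illustrative sketch (not the rsgt implementation): a smooth random
    transfer function built from a few random control points."""
    rng = np.random.default_rng(seed)
    xs = np.linspace(min_val, max_val, n_points)    # control point positions
    ys = rng.uniform(0.0, 1.0, n_points)            # random target gray values in [0, 1]
    grid = np.arange(min_val, max_val + 1)          # one LUT entry per gray value
    # Find the segment each gray value falls into
    idx = np.clip(np.searchsorted(xs, grid, side='right') - 1, 0, n_points - 2)
    t = (grid - xs[idx]) / (xs[idx + 1] - xs[idx])  # position within the segment
    t = 0.5 - 0.5 * np.cos(np.pi * t)               # cosine easing keeps the curve smooth
    return (ys[idx] * (1 - t) + ys[idx + 1] * t).astype('float32')

# Apply the look-up table to a small random CT-like integer image
rng = np.random.default_rng(0)
image = rng.integers(-1000, 3001, size=(4, 4))
lut = random_smooth_lut(-1000, 3000, seed=0)
transformed = lut[image - (-1000)]                  # shift indices to start at 0
```

The real implementation may choose control points and interpolation differently; the sketch only conveys why shape and texture survive the transformation: nearby gray values stay nearby after mapping through a smooth function.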

API documentation: http://rsgt.readthedocs.io/

Installation

To use data augmentation with random smooth gray value transformations in your own project, simply install the rsgt package:

pip install rsgt
  • Requires Python 2.7+ or 3.5+
  • NumPy is the only other dependency

Data augmentation

The expected input is a numpy array with integer values, which is usually the case for medical gray value images, such as CT and MR scans.

from rsgt.augmentation import random_smooth_grayvalue_transform

# Apply gray value transformation to a numpy array
new_image = random_smooth_grayvalue_transform(image, dtype='float32')

The returned numpy array will have a floating point datatype and values in the range [0,1].

Mini-batches

While the function supports input data with any number of dimensions, it does not currently support mini-batches. A mini-batch of 3D images can be treated as a 4D input array, but all images in the mini-batch will then be subject to the same transformation: a single random look-up table is computed and applied to every image. Since there is no vectorized implementation of the transformation function yet, a for loop is currently the only way to transform the images in a mini-batch with different transformation functions:

for i in range(minibatch.shape[0]):
    minibatch[i] = random_smooth_grayvalue_transform(minibatch[i], dtype='float32')

Examples

[Figure: Original CT scan · Transformed CT scan #1 · Transformed CT scan #2 · Transformed CT scan #3]

The leftmost image is the original CT slice. The other images show the same slice with random smooth gray value transformations applied; the corresponding transformation function is shown below each transformed image.

This CT scan is from the kits19 challenge (CC BY-NC-SA 4.0 license).

Normalization functions

Because the augmentation function returns values in the range [0,1], it is necessary to either also apply the gray value transformation at inference time, or to normalize input images at inference time to [0,1]. The rsgt package comes with helper functions for CT and MR scans:

CT scans

Expected values of the original image are Hounsfield units ranging from -1000 for air (and below for background outside the image area) to around 3000 for very dense structures like metal implants.

from rsgt.normalization import normalize_ct_scan
normalized_image = normalize_ct_scan(image, dtype='float32')
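For illustration, a fixed-range normalization with these Hounsfield bounds can be sketched in plain NumPy. This is an assumption about the general approach, not the package's exact code (`fixed_range_normalize` is a hypothetical helper, not part of rsgt):

```python
import numpy as np

def fixed_range_normalize(image, min_hu=-1000, max_hu=3000):
    # Sketch only (not the rsgt implementation): clip to the Hounsfield
    # range and rescale linearly to [0, 1].
    clipped = np.clip(image, min_hu, max_hu)
    return ((clipped - min_hu) / (max_hu - min_hu)).astype('float32')

# -2000 (background) is clipped up to -1000, 3500 is clipped down to 3000
ct = np.array([[-2000, -1000], [0, 3500]])
normalized = fixed_range_normalize(ct)  # [[0, 0], [0.25, 1]]
```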

MR scans

Because values of MR scans are not standardized like those of CT scans, the normalization is based on the 5th and 95th percentiles of the input values. Values below and above these percentiles are clipped.

from rsgt.normalization import normalize_mr_scan
normalized_image = normalize_mr_scan(image, dtype='float32')
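The percentile-based scheme can likewise be sketched in plain NumPy. Again, this is an illustration of the idea, not the package's exact implementation (`percentile_normalize` is a hypothetical helper):

```python
import numpy as np

def percentile_normalize(image, lower=5, upper=95):
    # Sketch only (not the rsgt implementation): clip at the 5th and 95th
    # percentiles of the input values, then rescale linearly to [0, 1].
    lo, hi = np.percentile(image, [lower, upper])
    clipped = np.clip(image, lo, hi)
    return ((clipped - lo) / (hi - lo)).astype('float32')

# Synthetic MR-like volume with an arbitrary intensity scale
mr = np.random.default_rng(1).normal(400.0, 120.0, size=(8, 8, 8))
normalized = percentile_normalize(mr)
```

Clipping at percentiles makes the result robust to the arbitrary intensity scale of MR scans and to outlier voxels, at the cost of saturating the darkest and brightest 5% of values.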

This normalization can also be used in combination with the augmentation technique:

from rsgt.augmentation import random_smooth_grayvalue_transform
from rsgt.normalization import normalize_mr_scan

N = 4096  # number of bins
normalized_integer_image = (normalize_mr_scan(image, dtype='float32') * N).round().astype(int)
new_image = random_smooth_grayvalue_transform(normalized_integer_image, min_max_val=(0, N), dtype='float32')

Citation

Please cite our short paper describing random smooth gray value transformations for data augmentation when using this technique in your work:

N. Lessmann and B. van Ginneken, "Random smooth gray value transformations for cross modality learning with gray value invariant networks", arXiv:2003.06158

License

This package is released under the MIT license, as found in the LICENSE file, with the exception of the images in the /examples directory, which are released under a Creative Commons license (CC BY-NC-SA 4.0).

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

rsgt-1.1.0.tar.gz (5.0 kB)

Uploaded Source

Built Distribution

rsgt-1.1.0-py2.py3-none-any.whl (6.0 kB)

Uploaded Python 2 Python 3

File details

Details for the file rsgt-1.1.0.tar.gz.

File metadata

  • Download URL: rsgt-1.1.0.tar.gz
  • Upload date:
  • Size: 5.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.43.0 CPython/3.8.2

File hashes

Hashes for rsgt-1.1.0.tar.gz
Algorithm Hash digest
SHA256 bfe9c460b6ae127b65991a51f12881193e3e25d38fb2a7c329cb8a2036941739
MD5 bb058e4113c1994ed9cfce0a492b8500
BLAKE2b-256 01d1256b00952e297223eaa725819734d8b738604c2c48aaca5a75eb4df61210

See more details on using hashes here.

File details

Details for the file rsgt-1.1.0-py2.py3-none-any.whl.

File metadata

  • Download URL: rsgt-1.1.0-py2.py3-none-any.whl
  • Upload date:
  • Size: 6.0 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.43.0 CPython/3.8.2

File hashes

Hashes for rsgt-1.1.0-py2.py3-none-any.whl
Algorithm Hash digest
SHA256 d9b84af458ef15965e1d5d189217bd5bacc755fc1a7550c51c34893e695d3d79
MD5 0b26494342f0ce57274dc8e54f57908a
BLAKE2b-256 0d693792793597f9b667c1aa05769d2912320b5a3bf70b8b94068e1550c8de7c

