

Project description

Differentiable RandAugment

Optimize RandAugment with differentiable operations


Table of Contents

  • Introduction
  • Installation
  • Dependencies
  • Getting Started
  • Support Operations
  • License

Introduction

Differentiable RandAugment is a differentiable version of RandAugment. The original paper proposed finding the optimal parameters with a grid search; this library instead provides differentiable operations so that the gradient of the magnitude parameter can be computed and the parameter optimized directly. See getting started.

Installation

To install the latest version from PyPI:

$ pip install -U differentiable_randaugment

Or you can install from source by cloning the repository and running:

$ git clone https://github.com/affjljoo3581/Differentiable-RandAugment.git
$ cd Differentiable-RandAugment
$ python setup.py install

Dependencies

  • opencv_python
  • torch>=1.7
  • albumentations
  • numpy

Getting Started

First, create a RandAugmentModule with your desired number of operations. This module is a differentiable, torch.Tensor-compatible version of the RandAugment policy, so you can train the policy as if it were part of your neural network. Note that num_ops randomly selected operations will be applied to the images.

from differentiable_randaugment import RandAugmentModule

augmentor = RandAugmentModule(num_ops=2)
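
Because the policy is trained like any other part of the network, its magnitude should appear as a learnable parameter. A quick sanity check, assuming RandAugmentModule follows the usual torch.nn.Module conventions (as the optimizer example further below suggests):

# The policy's learnable parameter(s) show up like those of any other
# torch.nn.Module, which is what lets an optimizer update them later.
for name, param in augmentor.named_parameters():
    print(name, tuple(param.shape), param.requires_grad)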

Now apply the module to the images. Augmentations are usually applied inside a Dataset, which means the operations work on np.ndarray images. In that setting, however, gradients cannot be computed for the image or the magnitude parameter, because the entire optimization procedure is based on torch.Tensors. To resolve this, apply the module to torch.Tensor images rather than np.ndarray images.

for inputs, labels in train_dataloader:
    inputs = inputs.cuda()
    # Augment the tensor batch on the fly; gradients flow back to the
    # magnitude parameter through the model's loss.
    logits = model(augmentor(inputs))
    ...

Of course, other augmentations should be removed from preprocessing:

from albumentations import Compose, Normalize, Resize
from albumentations.pytorch import ToTensorV2

transform = Compose([
    Resize(...),
    Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
    ToTensorV2(),
])

Lastly, create an optimizer that includes this module's parameters. We recommend using different learning rates for the model and the augmentor:

from torch import optim

param_groups = [
    {"params": augmentor.parameters(), "lr": 10 * learning_rate},  # augmentation policy
    {"params": model.parameters(), "lr": learning_rate},           # prediction model
]
optimizer = optim.Adam(param_groups)

Now the RandAugment policy will be trained with your prediction model.
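
Putting the pieces together, one training step might look like the following sketch. It reuses the augmentor, model, optimizer, and train_dataloader from above; the cross-entropy loss is only an illustrative choice, not something prescribed by the library.

import torch.nn.functional as F

model.train()
for inputs, labels in train_dataloader:
    inputs, labels = inputs.cuda(), labels.cuda()

    # Augment torch.Tensor images so the gradient of the loss reaches the
    # policy's magnitude parameter as well as the model weights.
    logits = model(augmentor(inputs))
    loss = F.cross_entropy(logits, labels)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()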

After training the RandAugmentModule, read the learned optimal magnitude by calling augmentor.get_magnitude() and use it as follows:

from differentiable_randaugment import RandAugment

transform = Compose([
    Resize(...),
    RandAugment(num_ops=..., magnitude=...),
    Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
    ToTensorV2(),
])
dataset = Dataset(..., transform=transform)

Since RandAugment is implemented as an albumentations transform, you can combine it with other albumentations augmentations in the same pipeline, as in the sketch below.
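
For example, a pipeline that mixes in another albumentations transform might look like this. HorizontalFlip and the 224x224 size are purely illustrative choices, and num_ops=2 simply mirrors the earlier example; the magnitude is the value learned above.

import albumentations as A
from albumentations.pytorch import ToTensorV2
from differentiable_randaugment import RandAugment

magnitude = augmentor.get_magnitude()  # learned value from the training run above

transform = A.Compose([
    A.Resize(224, 224),                           # placeholder size
    A.HorizontalFlip(p=0.5),                      # any other albumentations transform
    RandAugment(num_ops=2, magnitude=magnitude),
    A.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
    ToTensorV2(),
])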

Support Operations

Differentiable RandAugment supports the operations described in the original paper; whether each one is differentiable with respect to the input image and to the magnitude is detailed in the project repository. The supported operations are:

  • Identity
  • ShearX
  • ShearY
  • TranslateX
  • TranslateY
  • Rotate
  • Cutout
  • AutoContrast
  • Equalize
  • Solarize
  • SolarizeAdd
  • Posterize
  • Contrast
  • Color
  • Brightness
  • Sharpness

License

Differentiable RandAugment is Apache-2.0 Licensed.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

differentiable_randaugment-0.1.2.tar.gz (12.4 kB)

Uploaded Source

Built Distribution

differentiable_randaugment-0.1.2-py3-none-any.whl (18.1 kB)

Uploaded Python 3

File details

Details for the file differentiable_randaugment-0.1.2.tar.gz.

File metadata

  • Download URL: differentiable_randaugment-0.1.2.tar.gz
  • Upload date:
  • Size: 12.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/47.1.0 requests-toolbelt/0.9.1 tqdm/4.56.0 CPython/3.7.9

File hashes

Hashes for differentiable_randaugment-0.1.2.tar.gz
  • SHA256: 513d267676a080e06635a3dbe7c91486a41f02e4dbfcbb2e244271ed45eb21e8
  • MD5: df94fe29204e8d33e3337eca9def55d6
  • BLAKE2b-256: f52fcde6dc629df0f4deaaebb91ed10d32b9bf14e6908ca84bf8916c0280c072

See more details on using hashes here.

File details

Details for the file differentiable_randaugment-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: differentiable_randaugment-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 18.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/47.1.0 requests-toolbelt/0.9.1 tqdm/4.56.0 CPython/3.7.9

File hashes

Hashes for differentiable_randaugment-0.1.2-py3-none-any.whl
  • SHA256: d6a04a71f556483e8649e4704ee537d52d4ee3d1a643bf7b4bcc2bc66b00a9a9
  • MD5: 001232c4adbb97f15573bfd114329f68
  • BLAKE2b-256: a153bc30de1b9603c89cd1899ff77bbb631a7c28b3ac1c84e656b507c9cc4c53

See more details on using hashes here.
