
Adas: Adaptive Scheduling of Stochastic Gradients


Table of Contents

  • Introduction
  • License
  • Citing Adas
  • Empirical Classification Results on CIFAR10, CIFAR100 and Tiny-ImageNet-200
  • QC Metrics
  • Requirements
  • Installation
  • Usage
  • Common Issues (running list)
  • TODO
  • Pytest

Introduction

Adas is an adaptive optimizer for scheduling the learning rate when training Convolutional Neural Networks (CNNs)

  • Adas exhibits the rapid minimization characteristics that adaptive optimizers like AdaM are favoured for
  • Adas exhibits generalization (low testing loss) characteristics on par with SGD-based optimizers, improving on the poor generalization characteristics of adaptive optimizers
  • Adas introduces no computational overhead over adaptive optimizers (see experimental results)
  • In addition to optimization, Adas introduces new probing metrics for CNN layer evaluation (quality metrics)

This repository contains a PyTorch implementation of the Adas learning rate scheduler algorithm as well as the Knowledge Gain and Mapping Condition metrics.

Visit the paper branch to see the paper-related code. You can use that code to replicate experiments from the paper.

License

Adas is released under the MIT License (refer to the LICENSE file for more information)

Permissions       Conditions                     Limitations
Commercial use    License and copyright notice   Liability
Distribution                                     Warranty
Modification
Private use

Citing Adas

@article{hosseini2020adas,
  title={Adas: Adaptive Scheduling of Stochastic Gradients},
  author={Hosseini, Mahdi S and Plataniotis, Konstantinos N},
  journal={arXiv preprint arXiv:2006.06587},
  year={2020}
}

Empirical Classification Results on CIFAR10, CIFAR100 and Tiny-ImageNet-200

Figure 1: Training performance using different optimizers across three datasets and two CNNs.

Table 1: Image classification performance (test accuracy) of ResNet34 trained with a fixed epoch budget.

QC Metrics

Please refer to QC on Wiki for more information on the two metrics, knowledge gain and mapping condition, used to monitor the training quality of CNNs
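The exact definitions live in the Wiki and the paper. Purely as a rough illustration of the flavour of such low-rank probes (an assumption-laden sketch, not the package's implementation), one can inspect the singular-value spectrum of a layer's unfolded weights:

import torch

def probe_layer(conv_weight: torch.Tensor, energy: float = 0.9):
    """Rough low-rank probe of a 4D conv weight (out_ch, in_ch, kH, kW).

    NOT the package's knowledge gain / mapping condition formulas;
    see the Wiki and the paper for the real definitions.
    """
    mat = conv_weight.reshape(conv_weight.shape[0], -1)   # unfold kernel to 2D
    s = torch.linalg.svdvals(mat)                         # singular values, descending
    cum = torch.cumsum(s, dim=0) / s.sum()
    k = int(torch.searchsorted(cum, torch.tensor(energy)).item()) + 1
    gain = (s[:k].sum() / s.sum()).item()                 # spectral mass kept at rank k
    condition = (s[0] / s[k - 1]).item()                  # conditioning of kept subspace
    return gain, condition

print(probe_layer(torch.randn(64, 3, 3, 3)))              # e.g. a ResNet stem conv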

Requirements

Software/Hardware

We use Python 3.7.

Please refer to Requirements on Wiki for the complete guidelines.

Computational Overhead

Adas introduces negligible overhead over adaptive optimizers: for example, mSGD+StepLR, mSGD+Adas, and AdaM all take 40–43 sec/epoch to train ResNet34 on CIFAR10 on the same PC/GPU platform.
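To check this on your own hardware, timing each epoch is enough. A minimal sketch, assuming a model, criterion, train_loader, num_epochs, and an Adas optimizer set up as in the Usage section below:

import time

for epoch in range(num_epochs):
    start = time.perf_counter()
    for inputs, targets in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()
    optimizer.epoch_step(epoch)  # Adas's per-epoch hook (see Usage below)
    print(f"epoch {epoch}: {time.perf_counter() - start:.1f} sec")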

Installation

  1. You can install Adas directly from PyPI using `pip install adas`, or clone this repository and install from source.
  2. You can also download the files in src/adas into your local code base and use them directly. Note that you will probably need to modify the imports to match how you perform imports in your codebase (see the sketch below).

All source code can be found in src/adas
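If you vendor the files from src/adas (option 2 above), only the import path changes. A hypothetical example, assuming you copied them into a mypackage/adas/ directory of your own project:

# Hypothetical vendored layout; adjust the path to wherever you copied src/adas.
from mypackage.adas import Adas  # instead of `from adas import Adas`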

For more information, also refer to Installation on Wiki

Usage

To use Adas, simply import the Adas class (a subclass of torch.optim.Optimizer) and use it as follows:

from adas import Adas

optimizer = Adas(params=list(model.parameters()),
                 lr=...,          # float, required: initial learning rate
                 beta=0.8,        # float
                 step_size=None,  # int or None
                 gamma=1,         # float
                 momentum=0,      # float
                 dampening=0,     # float
                 weight_decay=0,  # float
                 nesterov=False)  # bool
...
for epoch in epochs:
    for batch in train_dataset:
        ...
        loss.backward()
        optimizer.step()
    optimizer.epoch_step(epoch)

Note that optimizer.epoch_step(epoch) just needs to be called once at the end of each epoch.
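Putting it all together, here is a minimal end-to-end sketch. The toy model, the random data, and the lr value are placeholders chosen only so the snippet runs; swap in your own:

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from adas import Adas

# Toy stand-ins for a real CNN and dataset
model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
                      nn.Flatten(), nn.Linear(8 * 32 * 32, 10))
dataset = TensorDataset(torch.randn(64, 3, 32, 32),
                        torch.randint(0, 10, (64,)))
train_loader = DataLoader(dataset, batch_size=16)
criterion = nn.CrossEntropyLoss()

optimizer = Adas(params=list(model.parameters()),
                 lr=0.03,  # placeholder value; tune for your task
                 beta=0.8, momentum=0.9, weight_decay=5e-4)

for epoch in range(2):
    for inputs, targets in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()
    optimizer.epoch_step(epoch)  # per-epoch lr update (see note above)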

Common Issues (running list)

  • None :)

TODO

  • Add medical imaging datasets (e.g. digital pathology, X-rays, and CT scans)
  • Extension of Adas to Deep Neural Networks

Pytest

Note the following:

  • Our pytest suite writes/downloads data and files to /tmp, so if your system has no /tmp folder (e.g. on Windows), adjust these paths if you wish to run the tests yourself

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

adas-1.1.3.tar.gz (18.8 kB)

Uploaded Source

Built Distribution

adas-1.1.3-py3-none-any.whl (18.9 kB)

Uploaded Python 3

File details

Details for the file adas-1.1.3.tar.gz.

File metadata

  • Download URL: adas-1.1.3.tar.gz
  • Upload date:
  • Size: 18.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.10.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.7.3

File hashes

Hashes for adas-1.1.3.tar.gz
Algorithm Hash digest
SHA256 f3521cf11f1345ffa8f100d0b3d4e988f5aeefbb132eab61db6a66bf6c76cfc6
MD5 8e4030ab3a94a3624c189807191b4cb3
BLAKE2b-256 958240273b6f3483851e51c9f77f16bdde640d17223088096ed9bc3d928ecd5a

See the PyPI documentation for more details on using hashes.

File details

Details for the file adas-1.1.3-py3-none-any.whl.

File metadata

  • Download URL: adas-1.1.3-py3-none-any.whl
  • Upload date:
  • Size: 18.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.10.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.7.3

File hashes

Hashes for adas-1.1.3-py3-none-any.whl
Algorithm Hash digest
SHA256 9f5572e8b1bd5cf33fb9d74fb2bf3ee81ed53384efb25530c9b326af749d7476
MD5 8fcdc359ab1d149f11de2460f0ec6e68
BLAKE2b-256 92019c80c333010abdd8a7061d06df073da7ef9ab52dcf55eb2c58df51fc3d87

See the PyPI documentation for more details on using hashes.
