Python package for AdaS: Adaptive Scheduling of Stochastic Gradients

Project description

AdaS: Adaptive Scheduling of Stochastic Gradients

Status

License: MIT

Introduction

AdaS is an adaptive optimizer that schedules the learning rate when training Convolutional Neural Networks (CNNs):

  • AdaS exhibits the rapid minimization characteristics that adaptive optimizers like AdaM are favoured for
  • AdaS exhibits generalization (low testing loss) characteristics on par with SGD-based optimizers, improving on the poor generalization characteristics of adaptive optimizers
  • AdaS introduces no computational overhead over adaptive optimizers (see experimental results)
  • In addition to optimization, AdaS introduces new probing metrics for CNN layer evaluation (quality metrics)

This repository contains a PyTorch implementation of the AdaS learning rate scheduler algorithm as well as the Knowledge Gain and Mapping Condition metrics.

Visit the paper branch to see the paper-related code. You can use that code to replicate experiments from the paper.

License

AdaS is released under the MIT License (refer to the LICENSE file for more information)

Permissions: Commercial use, Distribution, Modification, Private Use
Conditions: License and Copyright Notice
Limitations: Liability, Warranty

Citing AdaS

@article{hosseini2020adas,
  title={AdaS: Adaptive Scheduling of Stochastic Gradients},
  author={Hosseini, Mahdi S and Plataniotis, Konstantinos N},
  journal={arXiv preprint arXiv:2006.06587},
  year={2020}
}

Empirical Classification Results on CIFAR10, CIFAR100 and Tiny-ImageNet-200

Figure 1: Training performance using different optimizers across three datasets and two CNNs.

Table 1: Image classification performance (test accuracy) for ResNet34 training with a fixed epoch budget.

QC Metrics

Please refer to QC on the Wiki for more information on the two metrics, Knowledge Gain and Mapping Condition, used to monitor the training quality of CNNs.
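
As a rough, self-contained illustration of what spectrum-based probes of this kind can look like (the unfolding, the top-half energy cut, and the function name below are our own assumptions for illustration, not the reference implementation; the exact definitions are in the paper and in src/adas), both quantities can be derived from the singular values of an unfolded convolution kernel:

import torch

def probe_conv_layer(weight):
    """Illustrative knowledge-gain/mapping-condition style probes.

    `weight` is a conv kernel of shape (out_ch, in_ch, kH, kW).
    NOTE: a sketch only; see the paper for the exact definitions.
    """
    # Unfold the 4D kernel into a 2D matrix of shape (out_ch, in_ch*kH*kW).
    mat = weight.reshape(weight.shape[0], -1)
    s = torch.linalg.svdvals(mat)  # singular values, descending order
    energy = s.pow(2)
    # Knowledge-gain-style quantity: fraction of spectral energy carried
    # by the leading singular values (top half, an arbitrary choice here).
    k = max(1, s.numel() // 2)
    gain = (energy[:k].sum() / energy.sum()).item()
    # Mapping-condition-style quantity: condition number of the mapping.
    cond = (s[0] / s[-1].clamp_min(1e-12)).item()
    return gain, cond

conv = torch.nn.Conv2d(16, 32, kernel_size=3)
print(probe_conv_layer(conv.weight.detach()))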

Requirements

Software/Hardware

We use Python 3.7.

Please refer to Requirements on the Wiki for the complete guidelines.

Computational Overhead

AdaS introduces minimal overhead over adaptive optimizers: e.g. mSGD+StepLR, mSGD+AdaS, and AdaM all consume 40-43 sec/epoch when training ResNet34 on CIFAR10 on the same PC/GPU platform.
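
To check the overhead claim on your own hardware, a wall-clock timer around one epoch is enough. A minimal sketch (`timed_epoch` and its arguments are placeholder names, not part of the package):

import time

def timed_epoch(model, criterion, optimizer, train_loader):
    """Train for one epoch and return the wall-clock time in seconds."""
    start = time.perf_counter()
    for inputs, targets in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()
    if hasattr(optimizer, "epoch_step"):  # AdaS-specific per-epoch call
        optimizer.epoch_step()
    return time.perf_counter() - start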

Installation

  1. You can install AdaS directly from PyPI using `pip install adas`, or clone this repository and install from source.
  2. You can also download the files in src/adas into your local code base and use them directly. Note that you will likely need to modify the imports to match how imports are performed in your codebase.

All source code can be found in src/adas

For more information, also refer to Installation on Wiki

Usage

To use AdaS, simply import the AdaS(torch.optim.optimizer.Optimizer) class and use it as follows:

from adas import AdaS

optimizer = AdaS(params=model.parameters(),
                 listed_params=list(model.parameters()),
                 lr=...,  # required: set your initial learning rate
                 beta=0.8,
                 step_size=None,
                 gamma=1,
                 momentum=0,
                 dampening=0,
                 weight_decay=0,
                 nesterov=False)
...
for batch in train_dataset:
    optimizer.zero_grad()
    ...  # forward pass computing `loss`
    loss.backward()
    optimizer.step()
optimizer.epoch_step()

Note that optimizer.epoch_step() should be called at the end of each epoch.
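
Putting it together, here is a minimal end-to-end sketch (the toy dataset, model, and the lr value are placeholders for illustration; only the AdaS calls follow the API shown above):

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

from adas import AdaS

# Toy stand-ins for a real dataset and CNN (placeholders only).
data = TensorDataset(torch.randn(256, 3, 32, 32),
                     torch.randint(0, 10, (256,)))
train_loader = DataLoader(data, batch_size=32, shuffle=True)
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU(), nn.Flatten(),
                      nn.Linear(8 * 30 * 30, 10))
criterion = nn.CrossEntropyLoss()

optimizer = AdaS(params=model.parameters(),
                 listed_params=list(model.parameters()),
                 lr=0.03,  # placeholder value; tune for your task
                 beta=0.8)

for epoch in range(5):
    for inputs, targets in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()
    optimizer.epoch_step()  # AdaS learning-rate update, once per epoch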

Common Issues (running list)

  • None :)

TODO

  • Add medical imaging datasets (e.g. digital pathology, X-ray, and CT scans)
  • Extend AdaS to deep neural networks

Pytest

Note the following:

  • Our Pytests write/download data and files to /tmp, so if you don't have a /tmp folder (e.g. you're on Windows), correct this before running the tests yourself (see the sketch below)
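
One portable workaround, using only the standard library, is to resolve the temporary directory with tempfile instead of hard-coding /tmp (a sketch of the substitution, not the tests' current code):

import os
import tempfile

# tempfile.gettempdir() resolves to /tmp on Linux/macOS and to the
# user's temp directory (e.g. %TEMP%) on Windows.
scratch = os.path.join(tempfile.gettempdir(), "adas-tests")
os.makedirs(scratch, exist_ok=True)
print(scratch)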
