Project description

NEMO (NEural Minimizer for pytOrch)

NEMO (NEural Minimizer for pytOrch) is a small library for the minimization of deep neural networks developed in PyTorch, aimed at their deployment on ultra-low-power, highly memory-constrained platforms, in particular (but not exclusively) PULP-based microcontrollers. NEMO's features include:

  • deployment-related transformations such as BatchNorm folding, bias removal, and weight equalization
  • collection of statistics on activations and weights
  • post-training quantization
  • quantization-aware fine-tuning, with partially automated precision relaxation
  • mixed-precision quantization
  • bit-accurate deployment model
  • export to ONNX
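BatchNorm folding, the first transformation in the list above, merges a BatchNorm layer's normalization into the weights and bias of the preceding convolution or linear layer, so no separate normalization step survives at inference time. A minimal per-channel sketch of the arithmetic (an illustration of the math, not NEMO's API):

```python
import math

def fold_batchnorm(w, b, gamma, beta, mean, var, eps=1e-5):
    """Fold per-channel BatchNorm parameters into the preceding layer's
    per-channel weight scale w and bias b:
        w' = w * gamma / sqrt(var + eps)
        b' = (b - mean) * gamma / sqrt(var + eps) + beta
    """
    folded_w, folded_b = [], []
    for wc, bc, g, bt, m, v in zip(w, b, gamma, beta, mean, var):
        s = g / math.sqrt(v + eps)
        folded_w.append(wc * s)
        folded_b.append((bc - m) * s + bt)
    return folded_w, folded_b
```

After folding, the BatchNorm layer can be dropped entirely, which matters on memory-constrained targets because it removes both parameters and an extra pass over the activations.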

NEMO operates on three different "levels" of quantization-aware DNN representations, all built upon torch.nn.Module and torch.autograd.Function:

  • fake-quantized (FQ): replaces regular activations (e.g., ReLU) with quantization-aware ones (PACT) and uses dynamically quantized weights (with linear, PACT-like quantization), maintaining full trainability (similar to PyTorch's native quantization support, but not based on it).
  • quantized-deployable (QD): replaces all functions with deployment-equivalent versions, trading trainability for a more accurate representation of numerical behavior on real hardware.
  • integer-deployable (ID): replaces all activation and weight tensors used along the network with integer-based ones, aiming at a bit-accurate representation of actual hardware behavior.

All quantized representations support mixed-precision weights (signed and asymmetric) and activations (unsigned). The current version of NEMO targets per-layer quantization; work on per-channel quantization is in progress.
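At the FQ level, the core operation is linear "fake" quantization: values are clipped and snapped onto a uniform grid while staying in floating point, so gradients can still flow through the network. A dependency-free sketch of that arithmetic for a PACT-clipped activation and a symmetric signed weight (an illustration of the scheme, not NEMO code; the clipping bound alpha is shown here as a fixed value, whereas PACT learns it during training):

```python
def fake_quantize_act(x, alpha=6.0, bits=8):
    """PACT-style: clip to [0, alpha], then round onto 2**bits - 1 steps."""
    step = alpha / (2**bits - 1)
    return [round(min(max(v, 0.0), alpha) / step) * step for v in x]

def fake_quantize_weight(w, bits=8):
    """Symmetric linear quantization of signed weights."""
    step = max(abs(v) for v in w) / (2**(bits - 1) - 1)
    return [round(v / step) * step for v in w]
```

The output tensors are still floats, but every value is one of at most 2**bits grid points, which is what makes the later QD and ID stages a faithful re-expression of the same network.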

NEMO is organized as a Python library that can be applied with relatively small changes to an existing PyTorch based script or training framework.
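In the ID representation described above, every tensor is an integer, and the floating-point scaling factors that relate quantized tensors to real values collapse into integer multiply-and-shift "requantization" steps, the form a microcontroller without an FPU can execute exactly. A minimal sketch of that arithmetic (again an illustration of the technique, not NEMO code):

```python
def quantize_scale(s, shift=24):
    """Approximate a real scale factor s as mul / 2**shift."""
    return round(s * (1 << shift)), shift

def requantize(acc, mul, shift):
    """Integer-only rescaling of an accumulator: multiplying by the real
    scale becomes an integer multiply followed by a right shift."""
    return (acc * mul) >> shift
```

Because both operands and the result are integers, the same computation produces bit-identical results on the host and on the target hardware, which is what makes a bit-accurate deployment model possible.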

Installation and requirements

The NEMO library currently supports PyTorch >= 1.3.1 and runs on Python >= 3.5. To install it from PyPI, run:

pip install pytorch-nemo

Then import it in your script:

import nemo

Example

Documentation

Full documentation for NEMO is under development (see doc folder). You can find a technical report covering the deployment-aware quantization methodology here: https://arxiv.org/abs/2004.05930

License

NEMO is released under the Apache License 2.0; see the LICENSE file in the root of the repository for details.

Acknowledgements

NEMO is an outcome of the ALOHA Project, funded under the European Commission's Horizon 2020 Research and Innovation Programme, grant agreement no. 780788.

Download files

Download the file for your platform.

Source Distribution

pytorch-nemo-0.0.3.tar.gz (36.6 kB)


Built Distribution

pytorch_nemo-0.0.3-py3-none-any.whl (52.8 kB)


File details

Details for the file pytorch-nemo-0.0.3.tar.gz.

File metadata

  • Download URL: pytorch-nemo-0.0.3.tar.gz
  • Size: 36.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.45.0 CPython/3.8.2

File hashes

Hashes for pytorch-nemo-0.0.3.tar.gz:

  • SHA256: 3f9d2ded70427ac2fc70d4e4a787c2574a4794a710c26de1aba89d46286f10b8
  • MD5: ff6ab783b9ff6175acc8669b06a9d320
  • BLAKE2b-256: dd33eafcff8c1d04b0a86749935a2437a16a328c125f949e2f8332ab22cf5125

File details

Details for the file pytorch_nemo-0.0.3-py3-none-any.whl.

File metadata

  • Download URL: pytorch_nemo-0.0.3-py3-none-any.whl
  • Size: 52.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.45.0 CPython/3.8.2

File hashes

Hashes for pytorch_nemo-0.0.3-py3-none-any.whl:

  • SHA256: f845eaee510d62e37fb759def7d099d6da58bf425ce5f12a477ab7137643c03e
  • MD5: 3ea4b75614d82e44209edabf096562dd
  • BLAKE2b-256: 38eeff008f3cdf506a902be56a92de639af6752ba5886433f03b9579e477cccc
