NEMO (NEural Minimizer for pytOrch)
NEMO (NEural Minimizer for pytOrch) is a small library for the minimization of deep neural networks developed in PyTorch, aimed at deploying them on ultra-low-power, highly memory-constrained platforms, in particular (but not exclusively) PULP-based microcontrollers. NEMO features include:
- deployment-related transformations such as BatchNorm folding, bias removal, weight equalization
- collection of statistics on activations and weights
- post-training quantization
- quantization-aware fine-tuning, with partially automated precision relaxation
- mixed-precision quantization
- bit-accurate deployment model
- export to ONNX
NEMO operates on three different "levels" of quantization-aware DNN representations, all built upon torch.nn.Module and torch.autograd.Function:
- fake-quantized (FQ): replaces regular activations (e.g., ReLU) with quantization-aware ones (PACT) and uses dynamically quantized weights (with linear, PACT-like quantization), maintaining full trainability (similar to the native PyTorch support, but not based on it).
- quantized-deployable (QD): replaces all functions with deployment-equivalent versions, trading off trainability for a more accurate representation of numerical behavior on real hardware.
- integer-deployable (ID): replaces all activation and weight tensors used along the network with integer-based ones. It aims at a bit-accurate representation of actual hardware behavior.
All the quantized representations support mixed-precision weights (signed and asymmetric) and activations (unsigned). The current version of NEMO targets per-layer quantization; work on per-channel quantization is in progress.
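The following minimal sketch shows how a trained model typically moves through these three levels. The calls (nemo.transform.quantize_pact, change_precision, qd_stage, id_stage) follow the MNIST notebook linked below and may differ across NEMO versions, so treat the exact names, arguments, and shapes here as assumptions rather than reference usage.

```python
import copy
import torch
import nemo

# Assumption: `model` is an already trained torch.nn.Module (e.g., a small MNIST CNN)
# with a 1x1x28x28 input; adapt the dummy input to your own network.
dummy_input = torch.randn(1, 1, 28, 28)

# Fake-quantized (FQ): PACT activations + linearly quantized weights, still trainable
model_q = nemo.transform.quantize_pact(copy.deepcopy(model), dummy_input=dummy_input)
model_q.change_precision(bits=8)    # per-layer weight/activation bit-width

# ... optional quantization-aware fine-tuning of `model_q` goes here ...

# Quantized-deployable (QD): deployment-equivalent operators, no longer trainable
model_q.qd_stage(eps_in=1.0 / 255)  # eps_in: quantum of the (normalized) input

# Integer-deployable (ID): integer-valued tensors only, bit-accurate deployment model
model_q.id_stage()
```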
NEMO is organized as a Python library that can be applied, with relatively small changes, to an existing PyTorch-based script or training framework.
Installation and requirements
The NEMO library currently supports PyTorch >= 1.3.1 and runs on Python >= 3.5. To install it from PyPI, just run:
pip install pytorch-nemo
You can also install a development (and editable) version of NEMO by cloning this repository directly:
git clone https://github.com/pulp-platform/nemo
cd nemo
pip install -e .
Then, you can import it in your script using
import nemo
Example
- MNIST post-training quantization: https://colab.research.google.com/drive/1AmcITfN2ELQe07WKQ9szaxq-WSu4hdQb
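Once a model has been brought down to the integer-deployable stage, as in the notebook above, it is still an ordinary torch.nn.Module. As a generic sketch of the "export to ONNX" step listed among the features, and assuming NEMO's own export helpers are not used, the standard PyTorch exporter can be applied; the model name, file name, and opset below are illustrative only.

```python
import torch

# Assumption: `model_id` is an integer-deployable NEMO model taking a 1x1x28x28 input.
torch.onnx.export(model_id, torch.randn(1, 1, 28, 28), "mnist_nemo.onnx", opset_version=11)
```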
Documentation
Full documentation for NEMO is under development (see the doc folder). You can find a technical report covering the deployment-aware quantization methodology here: https://arxiv.org/abs/2004.05930
License
NEMO is released under the Apache License 2.0; see the LICENSE file in the root of this repository for details.
Acknowledgements
NEMO is an outcome of the ALOHA Project, funded by the European Commission under the EU's Horizon 2020 Research and Innovation Programme, grant agreement no. 780788.
Download files
File details
Details for the file pytorch-nemo-0.0.5.tar.gz.
File metadata
- Download URL: pytorch-nemo-0.0.5.tar.gz
- Upload date:
- Size: 38.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.8.2
File hashes
Algorithm | Hash digest
---|---
SHA256 | 35c4b1416e16760825da4792f97cd3b7bd1b77ca62382272af325ec723995cdb
MD5 | c60661d1f4f6bc54d26ef181e28e0a83
BLAKE2b-256 | 4a8b8c35da6771dea8a4882242a0fd4817a2892cf93adfb026013fe0eff50b23
File details
Details for the file pytorch_nemo-0.0.5-py3-none-any.whl.
File metadata
- Download URL: pytorch_nemo-0.0.5-py3-none-any.whl
- Upload date:
- Size: 55.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.8.2
File hashes
Algorithm | Hash digest
---|---
SHA256 | 0a3a7d3a418b0ab1c3e4c07078bdb3bdf940f2cb0ff82554bf5aa7f2520bb9ec
MD5 | fa505008590413df9fb2d229b9ce7e04
BLAKE2b-256 | 1cd79c2f6f1741ba9ea4331582bbeb3fbdb9ef77621db0b333f7dcbf077d5c5e