
NEMO (NEural Minimizer for pytOrch)

NEMO (NEural Minimizer for pytOrch) is a small library for the minimization of Deep Neural Networks developed in PyTorch, aimed at deploying them on ultra-low-power, highly memory-constrained platforms, in particular (but not exclusively) PULP-based microcontrollers. NEMO's features include:

  • deployment-related transformations such as BatchNorm folding, bias removal, weight equalization
  • collection of statistics on activations and weights
  • post-training quantization
  • quantization-aware fine-tuning, with partially automated precision relaxation
  • mixed-precision quantization
  • bit-accurate deployment model
  • export to ONNX
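As background for the first bullet, here is a minimal sketch (plain Python, not NEMO's API) of how a BatchNorm layer can be folded into the weights and bias of the preceding convolution, shown for a toy per-channel scalar case:

```python
import math

def fold_batchnorm(w, b, gamma, beta, mean, var, eps=1e-5):
    """Fold y = gamma * (conv(x) - mean) / sqrt(var + eps) + beta
    into new per-channel weights w' and biases b'.

    w, b: per-output-channel weight scalars and biases (toy 1x1 case);
    gamma, beta, mean, var: BatchNorm parameters, one per channel.
    """
    w_f, b_f = [], []
    for wi, bi, g, be, m, v in zip(w, b, gamma, beta, mean, var):
        s = g / math.sqrt(v + eps)      # per-channel BatchNorm scale
        w_f.append(wi * s)              # folded weight
        b_f.append((bi - m) * s + be)   # folded bias
    return w_f, b_f

# One channel: conv output c = 2*x + 0.5, followed by BatchNorm.
w_f, b_f = fold_batchnorm([2.0], [0.5], [1.5], [0.1], [0.2], [0.04], eps=0.0)
x = 3.0
conv = 2.0 * x + 0.5
bn = 1.5 * (conv - 0.2) / math.sqrt(0.04) + 0.1   # conv followed by BatchNorm
folded = w_f[0] * x + b_f[0]                       # single folded layer
```

After folding, the BatchNorm layer disappears entirely, which removes a floating-point normalization step that is expensive on integer-only microcontrollers.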

NEMO operates on three different "levels" of quantization-aware DNN representations, all built upon torch.nn.Module and torch.autograd.Function:

  • fake-quantized FQ: replaces regular activations (e.g., ReLU) with quantization-aware ones (PACT) and dynamically quantizes weights (with linear PACT-like quantization), maintaining full trainability (similar to the native PyTorch support, but not based on it).
  • quantized-deployable QD: replaces all functions with deployment-equivalent versions, trading off trainability for a more accurate representation of numerical behavior on real hardware.
  • integer-deployable ID: replaces all activation and weight tensors in the network with integer-based ones, aiming at a bit-accurate representation of actual hardware behavior.

All three quantized representations support mixed-precision weights (signed and asymmetric) and activations (unsigned). The current version of NEMO targets per-layer quantization; work on per-channel quantization is in progress.
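To make the FQ and ID levels concrete, here is a hedged, NEMO-independent sketch of PACT-style activation quantization: the activation is clipped to [0, alpha] and rounded onto one of 2^bits levels. The same value is exactly representable as an integer code, which is what the integer-deployable stage exploits to mirror the fake-quantized network bit-accurately:

```python
def pact_fake_quantize(x, alpha, bits):
    """Clip x to [0, alpha], then round to the nearest of 2**bits levels.
    Returns the fake-quantized float value and its integer code."""
    eps = alpha / (2 ** bits - 1)        # quantization step
    clipped = min(max(x, 0.0), alpha)    # PACT clipping (learned alpha in training)
    q = round(clipped / eps)             # integer code in 0 .. 2**bits - 1
    return q * eps, q

# 4-bit quantization with alpha = 1.0: step eps = 1/15.
fq, q = pact_fake_quantize(0.73, alpha=1.0, bits=4)
# fq (a float multiple of eps) and q (an integer) encode the same number,
# so an integer-only (ID) network can reproduce the FQ result exactly.
```

In NEMO's FQ level, alpha is a trainable parameter per activation layer (as in PACT), while this sketch treats it as a constant for clarity.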

NEMO is organized as a Python library that can be applied with relatively small changes to an existing PyTorch-based script or training framework.

Example
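The project's own example is not reproduced on this page; see the repository for NEMO's actual API. As a stand-in illustration of the post-training quantization step listed above (plain Python, not NEMO's code), here is a symmetric per-tensor linear quantizer of the kind applied to weights, with one shared scale per layer as in NEMO's current per-layer scheme:

```python
def quantize_weights(weights, bits):
    """Symmetric per-tensor linear quantization: map floats to signed
    integers in [-(2**(bits-1) - 1), 2**(bits-1) - 1] with one shared scale."""
    qmax = 2 ** (bits - 1) - 1
    scale = max(abs(w) for w in weights) / qmax   # one scale per tensor (per-layer)
    q = [round(w / scale) for w in weights]       # integer weight codes
    deq = [qi * scale for qi in q]                # dequantized (fake-quantized) values
    return q, scale, deq

# 8-bit quantization of a small weight tensor: qmax = 127, scale = 1/127.
q, scale, deq = quantize_weights([0.5, -1.0, 0.25], bits=8)
```

The rounding error `deq[i] - weights[i]` is bounded by half a quantization step, which is why post-training quantization often needs the quantization-aware fine-tuning NEMO also provides to recover accuracy at low bit widths.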

License

NEMO is released under the Apache License 2.0; see the LICENSE file in the root of this repository for details.

Requirements

The NEMO library currently supports PyTorch >= 1.3.

Acknowledgements


NEMO is an outcome of the ALOHA project, funded by the European Union's Horizon 2020 Research and Innovation Programme under grant agreement no. 780788.

Download files

Download the file for your platform.

Source Distribution

pytorch-nemo-0.0.1.tar.gz (35.2 kB)


Built Distribution

pytorch_nemo-0.0.1-py3-none-any.whl (51.6 kB)


File details

Details for the file pytorch-nemo-0.0.1.tar.gz.

File metadata

  • Download URL: pytorch-nemo-0.0.1.tar.gz
  • Upload date:
  • Size: 35.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.45.0 CPython/3.8.2

File hashes

Hashes for pytorch-nemo-0.0.1.tar.gz
Algorithm Hash digest
SHA256 d102a1a0eedec5fd6fb4caada857d82424edd851e95e7a4cdb6fa5b3a4769778
MD5 cb18fe97e0f35c5c407be23d82732dd5
BLAKE2b-256 1809bef89e286b73409560c96ed1557ec04d5412b450c27353d306173a12bd06


File details

Details for the file pytorch_nemo-0.0.1-py3-none-any.whl.

File metadata

  • Download URL: pytorch_nemo-0.0.1-py3-none-any.whl
  • Upload date:
  • Size: 51.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.45.0 CPython/3.8.2

File hashes

Hashes for pytorch_nemo-0.0.1-py3-none-any.whl
Algorithm Hash digest
SHA256 125eb7c092ddbca9731fb1af8496b8f1e6be99cd5e23b52b91158033928a511e
MD5 f54690db5f70ae808e73cb090cc222ba
BLAKE2b-256 1e4c8cb1288ca80028e56919bea88b31365e57ce4e26ea3e9421612a27e3ffe7

