
A Model Compression Toolkit for neural networks


Model Compression Toolkit (MCT)


Model Compression Toolkit (MCT) is an open-source project for neural network model optimization under efficient, constrained hardware.
This project provides researchers, developers, and engineers with tools for optimizing and deploying state-of-the-art neural networks on efficient hardware.
Specifically, this project applies constrained quantization and pruning schemes to a neural network.

Currently, this project only supports hardware-friendly post-training quantization (HPTQ) with TensorFlow 2 [1].

The MCT project is developed by researchers and engineers working at Sony Semiconductors Israel.

For more information, please visit our project website.


Getting Started

This section provides a quick-start guide. We begin with installation from source or via the pip package, then provide a short usage example.

Installation

See the MCT install guide for the pip package and for building from source.

From Source

git clone https://github.com/sony/model_optimization.git
python setup.py install

From PyPI - latest stable release

pip install model-compression-toolkit

A nightly package is also available (unstable):

pip install mct-nightly

To run MCT, one of the supported frameworks, TensorFlow or PyTorch, needs to be installed.

To use MCT with TensorFlow, please install the following packages: tensorflow, tensorflow-model-optimization

To use MCT with PyTorch (experimental), please install the following package: torch

MCT is tested with:

  • TensorFlow version 2.7
  • PyTorch version 1.10.0
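
For example, the following commands install framework versions matching the tested ones above (a suggestion only; adjust the versions to your environment):

pip install tensorflow==2.7.* tensorflow-model-optimization

pip install torch==1.10.0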

Usage Example

For an example of how to use post-training quantization with Keras, please use this link.

For an example using PyTorch (experimental), please use this link.

For more examples, please see the tutorials directory.
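
As a rough illustration of the flow described above, here is a minimal sketch of post-training quantization of a pretrained Keras model. It assumes the keras_post_training_quantization entry point of the MCT 1.x API; the exact signature may differ between releases, and the random calibration data is only a placeholder, so please follow the linked tutorials for the authoritative usage.

import numpy as np
from tensorflow.keras.applications.mobilenet import MobileNet

import model_compression_toolkit as mct

# Pretrained float model to be quantized.
model = MobileNet()

# Representative dataset generator used to calibrate activation quantization.
# Random data is a placeholder; feed real samples from your dataset instead.
def representative_data_gen():
    return [np.random.randn(1, 224, 224, 3)]

# Apply hardware-friendly post-training quantization (MCT 1.x-style call;
# the signature here is an assumption and may vary between releases).
quantized_model, quantization_info = mct.keras_post_training_quantization(
    model, representative_data_gen)

If the call succeeds, the returned quantized_model can be evaluated like the original float model to measure the accuracy impact of quantization.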

Supported Features

Quantization:

* Post-training quantization for Keras models.
* Post-training quantization for PyTorch models (experimental).
* Gradient-based post-training quantization (experimental, Keras only).
* Mixed-precision post-training quantization (experimental, Keras only).

TensorBoard Visualization (Experimental):

* CS Analyzer: compare the compressed model with the original model to analyze large accuracy drops.
* Activation statistics and errors.

Results

Keras

As part of the MCT library, we provide a set of example networks for image classification. These networks can be used as a reference when working with the package.

  • Image Classification Example with MobileNet V1 on ImageNet dataset
| Network Name    | Float Accuracy | 8-bit Accuracy | Comments |
|-----------------|----------------|----------------|----------|
| MobileNetV1 [2] | 70.558         | 70.418         |          |

For more results, please see [1].

Pytorch

We quantized classification networks from the torchvision library. In the following table we present the ImageNet validation results for these models:

| Network Name       | Float Accuracy | 8-bit Accuracy |
|--------------------|----------------|----------------|
| MobileNet V2 [3]   | 71.886         | 71.444         |
| ResNet-18 [3]      | 69.86          | 69.63          |
| SqueezeNet 1.1 [3] | 58.128         | 57.678         |

Contributions

MCT aims to stay up to date and welcomes contributions from anyone.

You will find more information about contributing in the Contribution guide.

License

Apache License 2.0.

References

[1] Habi, H.V., Peretz, R., Cohen, E., Dikstein, L., Dror, O., Diamant, I., Jennings, R.H. and Netzer, A., 2021. HPTQ: Hardware-Friendly Post Training Quantization. arXiv preprint.

[2] MobileNet from Keras Applications.

[3] torchvision.models

