A Model Compression Toolkit for neural networks
Model Compression Toolkit (MCT)
Model Compression Toolkit (MCT) is an open-source project for optimizing neural network models for deployment on efficient, constrained hardware.
This project provides researchers, developers, and engineers with tools for optimizing and deploying state-of-the-art neural networks on efficient hardware.
Specifically, this project applies constrained quantization and pruning schemes to a neural network.
Currently, this project only supports hardware-friendly post-training quantization (HPTQ) with Tensorflow 2 [1].
The MCT project is developed by researchers and engineers working at Sony Semiconductor Israel.
For more information, please visit our project website.
Table of Contents
- Getting Started
- Supported Features
- Results
- Contributions
- License
- References
Getting Started
This section provides a quick-start guide. We begin with installation, either from source or from the PyPI server. Then, we provide a short usage example.
Installation
See the MCT install guide for the pip package and for building from source.
From Source
git clone https://github.com/sony/model_optimization.git
cd model_optimization
python setup.py install
From PyPI - latest stable release
pip install model-compression-toolkit
A nightly package is also available (unstable):
pip install mct-nightly
To run MCT, one of the supported frameworks, Tensorflow or Pytorch, needs to be installed.
To use MCT with Tensorflow, please install the following packages: tensorflow, tensorflow-model-optimization.
To use MCT with Pytorch (experimental), please install the following package: torch.
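For example, both sets of framework dependencies can be installed with pip (a minimal sketch; pin the tested versions listed below):

pip install tensorflow tensorflow-model-optimization
pip install torch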
MCT is tested with:
- Tensorflow version 2.7
- Pytorch version 1.10.0
Usage Example
For an example of how to use post-training quantization with Keras, please use this link.
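As a quick illustration, the snippet below is a minimal sketch of post-training quantization for a Keras model. It assumes the `mct.keras_post_training_quantization` entry point of the MCT 1.x API; the model choice, input shape, and random representative dataset are placeholders, and the linked tutorial should be consulted for the exact, up-to-date form:

```python
import numpy as np
from tensorflow.keras.applications.mobilenet import MobileNet
import model_compression_toolkit as mct

# Float Keras model to quantize (placeholder choice).
model = MobileNet()

# Representative dataset generator: each call returns one batch of inputs.
# Random data is used here only for illustration; calibration should use
# real samples from the training distribution.
def representative_data_gen():
    return [np.random.random((1, 224, 224, 3))]

# Apply post-training quantization. Returns the quantized model and
# information collected during the quantization process.
quantized_model, quantization_info = mct.keras_post_training_quantization(
    model, representative_data_gen, n_iter=10)
```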
For an example using Pytorch (experimental), please use this link.
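Similarly, a minimal sketch for Pytorch, assuming the experimental `mct.pytorch_post_training_quantization` entry point (again with a placeholder model and random calibration data):

```python
import numpy as np
from torchvision.models import mobilenet_v2
import model_compression_toolkit as mct

# Pretrained float Pytorch model (placeholder choice).
model = mobilenet_v2(pretrained=True)

# Representative dataset generator; replace the random tensor with real
# ImageNet-like samples for meaningful calibration.
def representative_data_gen():
    return [np.random.random((1, 3, 224, 224))]

quantized_model, quantization_info = mct.pytorch_post_training_quantization(
    model, representative_data_gen, n_iter=10)
```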
For more examples, please see the tutorials directory.
Supported Features
Quantization:
* Post Training Quantization for Keras models.
* Post Training Quantization for Pytorch models (experimental).
* Gradient-based post-training quantization (Experimental, Keras only).
* Mixed-precision post training quantization (Experimental).
Tensorboard Visualization (Experimental):
* CS Analyzer: compare a compressed model with the original model to analyze large accuracy drops.
* Activation statistics and errors
Results
Keras
As part of the MCT library, we provide a set of image classification networks that can serve as examples of how to use the package.
- Image Classification Example with MobileNet V1 on ImageNet dataset
| Network Name | Float Accuracy (%) | 8-Bit Accuracy (%) | Comments |
|---|---|---|---|
| MobileNetV1 [2] | 70.558 | 70.418 | |
For more results, please see [1].
Pytorch
We quantized classification networks from the torchvision library. In the following table we present the ImageNet validation results for these models:
| Network Name | Float Accuracy (%) | 8-Bit Accuracy (%) |
|---|---|---|
| MobileNet V2 [3] | 71.886 | 71.444 |
| ResNet-18 [3] | 69.86 | 69.63 |
| SqueezeNet 1.1 [3] | 58.128 | 57.678 |
Contributions
MCT aims to stay up to date and welcomes contributions from anyone.
You will find more information about contributions in the Contribution guide.
License
References
[1] Habi, H.V., Peretz, R., Cohen, E., Dikstein, L., Dror, O., Diamant, I., Jennings, R.H. and Netzer, A., 2021. HPTQ: Hardware-Friendly Post Training Quantization. arXiv preprint.
[2] MobileNet from Keras Applications.
[3] Classification models from the torchvision library.