

Model Compression Toolkit (MCT)


Model Compression Toolkit (MCT) is an open source project for neural network model optimization under efficient, constrained hardware.
This project provides researchers, developers, and engineers with tools for optimizing and deploying state-of-the-art neural networks on efficient hardware.
Specifically, this project applies constrained quantization and pruning schemes on a neural network.

Currently, this project supports only hardware-friendly post-training quantization (HPTQ) with TensorFlow 2 [1].

The MCT project is developed by researchers and engineers working at Sony Semiconductors Israel.

For more information, please visit our project website.


Getting Started

This section provides a quick start guide. We begin with installation from source or via pip (PyPI). Then, we provide a short usage example.

Installation

See the MCT install guide for the pip package and for building from source.

From Source

git clone https://github.com/sony/model_optimization.git
python setup.py install

From PyPI - latest stable release

pip install model-compression-toolkit

A nightly package is also available (unstable):

pip install mct-nightly
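
To check that the installation succeeded, you can import the package from Python. This is a minimal sanity check, assuming the package exposes a __version__ attribute, as most PyPI packages do:

import model_compression_toolkit as mct
print(mct.__version__)  # assumed attribute; prints the installed MCT version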

Usage Example

An example of how to use post-training quantization with Keras is shown in the following code snippet.

import model_compression_toolkit as mct

# Set the batch size of the images at each calibration iteration.
batch_size = 50

# Create a representative data generator, which returns a list of images.
# Load a folder of images. 
folder = '/path/to/images/folder'

# The images can be preprocessed using a list of preprocessing functions.
def normalization(x):
    return (x - 127.5) / 127.5

# Create a FolderImageLoader instance, which loads the images, preprocesses them, and lets you sample batches of them.
image_data_loader = mct.FolderImageLoader(folder,
                                          preprocessing=[normalization],
                                          batch_size=batch_size)

# Create a Callable representative dataset for calibration purposes.
# The function should be called without any arguments, and should return a list of numpy arrays
# (one array per model input).
# For example: if the model has two input tensors - one with an input shape of 32x32x3 and the
# second with an input shape of 224x224x3 - and we calibrate the model using batches of 20 images,
# calling representative_data_gen() should return a list of two numpy.ndarray objects
# whose shapes are [(20, 32, 32, 3), (20, 224, 224, 3)].
def representative_data_gen() -> list:
    return [image_data_loader.sample()]


# Quantize a pre-trained Keras model ('model' here is an existing tf.keras.Model) using
# representative_data_gen as the source of calibration images.
# Set the number of calibration iterations to 10.
quantized_model, quantization_info = mct.keras_post_training_quantization(model,
                                                                          representative_data_gen,
                                                                          n_iter=10)
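
As a side illustration of the multi-input case described in the comments above, here is a minimal sketch of a representative data generator for a hypothetical model with two inputs of shapes 32x32x3 and 224x224x3; random numpy data stands in for real, preprocessed calibration images:

import numpy as np

# Illustrative only: random tensors stand in for real calibration images.
batch = 20

def two_input_representative_data_gen() -> list:
    # One numpy array per model input; shapes (20, 32, 32, 3) and (20, 224, 224, 3).
    return [np.random.rand(batch, 32, 32, 3).astype(np.float32),
            np.random.rand(batch, 224, 224, 3).astype(np.float32)]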

For more examples, please see the tutorials directory.

Supported Features

Quantization:

* Post-training quantization
* Gradient-based post-training quantization (Experimental)

Tensorboard Visualization (Experimental):

* CS (cosine-similarity) analyzer: compare the compressed model with the original model to analyze large accuracy drops; see the sketch below.
* Activation statistics and errors

MCT is tested with TensorFlow version 2.7.
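
To give a sense of what the CS analyzer measures, here is a minimal numpy sketch of a cosine-similarity comparison between the outputs of the float model and its quantized counterpart (an illustration only, not the MCT implementation):

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity between flattened tensors; values near 1.0 mean the
    # quantized model's outputs closely track the float model's outputs.
    a, b = a.flatten(), b.flatten()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Usage sketch (assumes 'model' and 'quantized_model' are Keras models and
# 'images' is a preprocessed input batch):
# similarity = cosine_similarity(model.predict(images), quantized_model.predict(images))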

Tutorials and Results

As part of the MCT library, we provide a set of example image-classification networks. These networks can serve as references when using the package.

  • Image Classification Example with MobileNet V1 on ImageNet dataset
Network Name       Float Accuracy (%)   8-Bit Accuracy (%)   Comments
MobileNetV1 [2]    70.558               70.418

For more results, please see [1].
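
If you want to check such numbers yourself, here is a hedged sketch of evaluating the quantized model's top-1 accuracy with standard Keras APIs; 'quantized_model' comes from the usage example above, while 'val_dataset' is an assumed tf.data.Dataset of preprocessed (image, label) pairs that you must build yourself:

# Hedged evaluation sketch; dataset construction and preprocessing are omitted.
quantized_model.compile(loss='sparse_categorical_crossentropy', metrics=['accuracy'])
loss, top1 = quantized_model.evaluate(val_dataset)
print(f'8-bit top-1 accuracy: {top1 * 100:.3f}%')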

Contributions

MCT aims to stay up to date and welcomes contributions from anyone.

You will find more information about contributing in the Contribution guide.

License

Apache License 2.0.

References

[1] Habi, H.V., Peretz, R., Cohen, E., Dikstein, L., Dror, O., Diamant, I., Jennings, R.H. and Netzer, A., 2021. HPTQ: Hardware-Friendly Post Training Quantization. arXiv preprint.

[2] MobileNetV1 from Keras Applications.
