Batch-Normalization Folding

Folds batch-normalization layers in tf.keras models.
This repository provides an implementation of the batch-normalization folding algorithm from IJCAI 2022. Batch-normalization folding consists in removing batch-normalization layers without changing the predictive function defined by the neural network. In the simplest scenario, a fully-connected layer followed by a batch-normalization layer, we get
x \mapsto \gamma \frac{Ax + b - \mu}{\sigma + \epsilon} + \beta = \frac{\gamma A}{\sigma + \epsilon} x + \gamma \frac{b - \mu}{\sigma + \epsilon} + \beta
Thus the two layers can be expressed as a single fully-connected layer at inference without any change in the predictive function.
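The identity above can be checked numerically. The following is a minimal numpy sketch of the folding arithmetic (not the package's actual implementation); all shapes and values are arbitrary, chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fully-connected layer y = A x + b (hypothetical shapes)
A = rng.normal(size=(4, 3))
b = rng.normal(size=4)

# Batch-normalization statistics and learned parameters, per output unit
mu, sigma = rng.normal(size=4), rng.uniform(1.0, 2.0, size=4)
gamma, beta = rng.normal(size=4), rng.normal(size=4)
eps = 1e-3

# Folded weights, following the identity: scale the rows of A, shift b
scale = gamma / (sigma + eps)
A_fold = scale[:, None] * A
b_fold = scale * (b - mu) + beta

x = rng.normal(size=3)
unfolded = gamma * (A @ x + b - mu) / (sigma + eps) + beta
folded = A_fold @ x + b_fold
assert np.allclose(unfolded, folded)
```

The two layers collapse into the single affine map (A_fold, b_fold), which is exactly what folding exploits at inference.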
Use

This repository is available as a pip package:

pip install tensorflow-batchnorm-folding
This implementation is compatible with tf.keras.Model instances. It was tested with the following models:
- ResNet 50
- MobileNet V2
- MobileNet V3
- EfficientNet B0
To run a simple test:

from batch_normalization_folding.folder import fold_batchnormalization_layers
import tensorflow as tf

mod = tf.keras.applications.efficientnet.EfficientNetB0()
folded_model, output_str = fold_batchnormalization_layers(mod, True)
The output_str is either the ratio num_layers_folded/num_layers_not_folded or 'failed', indicating a failure in the process.
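For convolutional layers, which dominate the models listed above, the same identity is assumed to apply per output channel: the kernel's output-channel slices are scaled and the bias is shifted. A minimal numpy sketch (not the package's implementation), checking one spatial position of a convolution via tensordot; all shapes are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Conv kernel (kh, kw, c_in, c_out) and bias (hypothetical shapes)
W = rng.normal(size=(3, 3, 2, 5))
b = rng.normal(size=5)

# Batch-normalization statistics and parameters, per output channel
mu, sigma = rng.normal(size=5), rng.uniform(1.0, 2.0, size=5)
gamma, beta = rng.normal(size=5), rng.normal(size=5)
eps = 1e-3

# Fold: scale broadcasts over the last (output-channel) axis of W
scale = gamma / (sigma + eps)
W_fold = W * scale
b_fold = scale * (b - mu) + beta

# One spatial position of the convolution is a tensordot over the patch
patch = rng.normal(size=(3, 3, 2))
y = np.tensordot(patch, W, axes=3) + b
unfolded = gamma * (y - mu) / (sigma + eps) + beta
folded = np.tensordot(patch, W_fold, axes=3) + b_fold
assert np.allclose(unfolded, folded)
```

Since every spatial position applies the same kernel, agreement at one position implies the folded convolution matches everywhere.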
To Do
- unit test on all keras applications models
- check package installation
- deal with Concatenate layers
Cite
@inproceedings{yvinec2022fold,
  title={To Fold or Not to Fold: a Necessary and Sufficient Condition on Batch-Normalization Layers Folding},
  author={Yvinec, Edouard and Dapogny, Arnaud and Bailly, Kevin},
  booktitle={IJCAI},
  year={2022}
}
Performance on Base Models
+----------------------+------------------+----------------------+
| Model                | BN layers folded | BN layers not folded |
+----------------------+------------------+----------------------+
| ResNet 50            |               53 |                    0 |
| EfficientNet B0      |               49 |                    0 |
| MobileNet V2         |               52 |                    0 |
| MobileNet V3         |               34 |                    0 |
| Inception ResNet V2  |              204 |                    0 |
| Inception V3         |               94 |                    0 |
| NASNet               |               28 |                  164 |
| DenseNet 121         |               59 |                   62 |
+----------------------+------------------+----------------------+