Folds batch-normalization layers in tf.keras models.
Batch-Normalization Folding
In this repository, we propose an implementation of the batch-normalization folding algorithm from IJCAI 2022. Batch-normalization folding consists of removing batch-normalization layers without changing the predictive function defined by the neural network. The simplest scenario is a fully-connected layer followed by a batch-normalization layer; in that case we get
x \mapsto \gamma \frac{Ax + b - \mu}{\sigma + \epsilon} + \beta = \gamma \frac{A}{\sigma + \epsilon} x + \gamma \frac{b - \mu}{\sigma + \epsilon} + \beta
Thus the two layers can be expressed as a single fully-connected layer at inference without any change in the predictive function.
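The identity above can be checked numerically. The following sketch (in NumPy, with arbitrary shapes chosen for illustration, and sigma treated as the standard deviation as in the formula above) folds a fully-connected layer followed by batch normalization into a single fully-connected layer and verifies that both compute the same output:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fully-connected layer x -> A x + b (shapes are illustrative)
A = rng.normal(size=(4, 3))
b = rng.normal(size=4)

# Batch-normalization parameters (sigma is the standard deviation)
gamma = rng.normal(size=4)
beta = rng.normal(size=4)
mu = rng.normal(size=4)
sigma = rng.uniform(0.5, 1.5, size=4)
eps = 1e-3

x = rng.normal(size=3)

# Unfolded: fully-connected layer followed by batch normalization
unfolded = gamma * (A @ x + b - mu) / (sigma + eps) + beta

# Folded: a single fully-connected layer with rescaled weights and shifted bias
scale = gamma / (sigma + eps)
A_fold = scale[:, None] * A
b_fold = scale * (b - mu) + beta
folded = A_fold @ x + b_fold

assert np.allclose(unfolded, folded)
```

The rescaling only touches the weights, so the folded layer has exactly the same inference cost as the original fully-connected layer alone.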
Use
This repository is available as a pip package (use pip install tensorflow-batchnorm-folding).
This implementation is compatible with tf.keras.Model instances. It was tested with the following models
- ResNet 50
- MobileNet V2
- MobileNet V3
- EfficientNet B0
To run a simple test:
import tensorflow as tf
from batch_normalization_folding.folder import fold_batchnormalization_layers

# Load a pre-trained model and fold its batch-normalization layers
mod = tf.keras.applications.efficientnet.EfficientNetB0()
folded_model, output_str = fold_batchnormalization_layers(mod, True)
The output_str is either the ratio num_layers_folded/num_layers_not_folded, or 'failed' to indicate that the folding process failed.
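Given that return convention, the string can be turned into usable counts. The parse_fold_ratio helper below is hypothetical (not part of the package) and assumes the ratio is formatted as "folded/not_folded":

```python
def parse_fold_ratio(output_str: str):
    """Parse the folding summary string returned by the folder.

    Assumes the 'folded/not_folded' format described above.
    Returns (num_folded, num_not_folded), or None when folding failed.
    """
    if output_str == "failed":
        return None
    folded, not_folded = output_str.split("/")
    return int(folded), int(not_folded)

parse_fold_ratio("53/0")    # e.g. ResNet 50 from the table below
parse_fold_ratio("failed")  # None
```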
To Do
- unit tests on all keras applications models
- check package installation
- deal with Concatenate layers
Cite
@inproceedings{yvinec2022fold,
title={To Fold or Not to Fold: a Necessary and Sufficient Condition on Batch-Normalization Layers Folding},
author={Yvinec, Edouard and Dapogny, Arnaud and Bailly, Kevin},
booktitle={IJCAI},
year={2022}
}
Performance on Base Models
+------------------------------------+
| ResNet 50 |
+------------------------------------+
| BN layers folded | 53 |
| BN layers not folded | 0 |
+------------------------------------+
| EfficientNet B0 |
+------------------------------------+
| BN layers folded | 49 |
| BN layers not folded | 0 |
+------------------------------------+
| MobileNet V2 |
+------------------------------------+
| BN layers folded | 52 |
| BN layers not folded | 0 |
+------------------------------------+
| MobileNet V3 |
+------------------------------------+
| BN layers folded | 34 |
| BN layers not folded | 0 |
+------------------------------------+
| Inception ResNet V2 |
+------------------------------------+
| BN layers folded | 204 |
| BN layers not folded | 0 |
+------------------------------------+
| Inception V3 |
+------------------------------------+
| BN layers folded | 94 |
| BN layers not folded | 0 |
+------------------------------------+
| NASNet |
+------------------------------------+
| BN layers folded | 28 |
| BN layers not folded | 164 |
+------------------------------------+
| DenseNet 121 |
+------------------------------------+
| BN layers folded | 59 |
| BN layers not folded | 62 |
+------------------------------------+