Folds BN layers in tf.keras models.
Batch-Normalization Folding
In this repository, we propose an implementation of the batch-normalization folding algorithm from IJCAI 2022. Batch-normalization folding consists of removing batch-normalization layers without changing the predictive function defined by the neural network. In the simplest scenario, a fully-connected layer followed by a batch-normalization layer, we get
x \mapsto \gamma \frac{Ax + b - \mu}{\sigma + \epsilon} + \beta = \frac{\gamma A}{\sigma + \epsilon} x + \gamma \frac{b - \mu}{\sigma + \epsilon} + \beta
Thus the two layers can be expressed as a single fully-connected layer at inference without any change in the predictive function.
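Here is a minimal NumPy sketch of that identity, written with the Keras inference-time formula, which divides by √(σ² + ε). The names are illustrative, not the package's internals:

```python
# Sketch (not the package's internals): folding a fully-connected layer
# followed by batch normalization into a single fully-connected layer.
import numpy as np

rng = np.random.default_rng(0)

# Fully-connected parameters: y = A @ x + b
A = rng.normal(size=(4, 3))           # (out_features, in_features)
b = rng.normal(size=4)

# Batch-normalization parameters (inference mode)
gamma = rng.normal(size=4)
beta = rng.normal(size=4)
mu = rng.normal(size=4)               # moving mean
var = rng.uniform(0.5, 2.0, size=4)   # moving variance
eps = 1e-3

# Folded parameters: rescale each output row of A and shift b
scale = gamma / np.sqrt(var + eps)
A_folded = scale[:, None] * A
b_folded = scale * (b - mu) + beta

# Check: FC -> BN equals the single folded FC layer
x = rng.normal(size=3)
bn_out = gamma * ((A @ x + b) - mu) / np.sqrt(var + eps) + beta
folded_out = A_folded @ x + b_folded
assert np.allclose(bn_out, folded_out)
```

The same row-wise rescaling applies to convolutional kernels, with one scale factor per output channel.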
Use
This repository is available as a pip package: `pip install tensorflow-batchnorm-folding`.
This implementation is compatible with `tf.keras.Model` instances. It was tested with the following models:
- ResNet 50
- MobileNet V2
- MobileNet V3
- EfficientNet B0
To run a simple test:

```python
import tensorflow as tf
from batch_normalization_folding.folder import fold_batchnormalization_layers

model = tf.keras.applications.efficientnet.EfficientNetB0()
folded_model, output_str = fold_batchnormalization_layers(model, True)
```
The `output_str` is either the ratio `num_layers_folded/num_layers_not_folded` or `'failed'` to indicate a failure in the process.
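As a quick sanity check (a sketch building on the call above, not a documented API beyond it), you can confirm that folding leaves the predictive function unchanged up to floating-point error:

```python
import numpy as np
import tensorflow as tf
from batch_normalization_folding.folder import fold_batchnormalization_layers

model = tf.keras.applications.efficientnet.EfficientNetB0()
folded_model, output_str = fold_batchnormalization_layers(model, True)

if output_str != 'failed':
    # Random input in the model's expected range and shape
    x = np.random.uniform(0, 255, size=(1, 224, 224, 3)).astype(np.float32)
    y_ref = model.predict(x)
    y_folded = folded_model.predict(x)
    print('max abs diff:', np.abs(y_ref - y_folded).max())
```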
To Do
- unit tests on all Keras Applications models
- check package installation
- deal with Concatenate layers
Cite
```bibtex
@inproceedings{yvinec2022fold,
  title={To Fold or Not to Fold: a Necessary and Sufficient Condition on Batch-Normalization Layers Folding},
  author={Yvinec, Edouard and Dapogny, Arnaud and Bailly, Kevin},
  booktitle={IJCAI},
  year={2022}
}
```
Performance on Base Models
| Model | BN layers folded | BN layers not folded |
|---|---|---|
| ResNet 50 | 53 | 0 |
| EfficientNet B0 | 49 | 0 |
| MobileNet V2 | 52 | 0 |
| MobileNet V3 | 34 | 0 |
| Inception ResNet V2 | 204 | 0 |
| Inception V3 | 94 | 0 |
| NASNet | 28 | 164 |
| DenseNet 121 | 59 | 62 |