Folds batch-normalization layers in tf.keras models.
Batch-Normalization Folding
In this repository, we propose an implementation of the batch-normalization folding algorithm from IJCAI 2022. Batch-normalization folding consists in removing batch-normalization layers without changing the predictive function defined by the neural network. The simplest scenario is a fully-connected layer followed by a batch-normalization layer, for which we get
x \mapsto \gamma \frac{Ax + b - \mu}{\sigma + \epsilon} + \beta = \frac{\gamma A}{\sigma + \epsilon} x + \gamma \frac{b - \mu}{\sigma + \epsilon} + \beta
Thus the two layers can be expressed as a single fully-connected layer at inference without any change in the predictive function.
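The derivation above can be checked numerically. A minimal NumPy sketch (illustrative only, not part of the package) that folds the batch-normalization parameters into a fully-connected layer's weights, using the same σ + ε convention as the formula above:

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out = 4, 3

# Fully-connected layer: y = A x + b
A = rng.normal(size=(d_out, d_in))
b = rng.normal(size=d_out)

# Batch-normalization parameters (one per output feature)
gamma = rng.normal(size=d_out)
beta = rng.normal(size=d_out)
mu = rng.normal(size=d_out)
sigma = rng.uniform(0.5, 2.0, size=d_out)
eps = 1e-3

# Folded parameters: A' = (gamma / (sigma + eps)) A,
#                    b' = gamma (b - mu) / (sigma + eps) + beta
scale = gamma / (sigma + eps)
A_fold = scale[:, None] * A
b_fold = scale * (b - mu) + beta

# The folded single layer matches the FC + BN composition exactly
x = rng.normal(size=d_in)
y_two_layers = gamma * (A @ x + b - mu) / (sigma + eps) + beta
y_folded = A_fold @ x + b_fold
assert np.allclose(y_two_layers, y_folded)
```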
Use
This repository is available as a pip package:

pip install tensorflow-batchnorm-folding
This implementation is compatible with tf.keras.Model instances. It was tested with the following models:
- ResNet 50
- MobileNet V2
- MobileNet V3
- EfficientNet B0
To run a simple test:
from batch_normalization_folding.folder import fold_batchnormalization_layers
import tensorflow as tf

mod = tf.keras.applications.efficientnet.EfficientNetB0()
folded_model, output_str = fold_batchnormalization_layers(mod, True)
The output_str is either the ratio num_layers_folded/num_layers_not_folded or 'failed' to indicate a failure in the process.
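The same algebra applies per output channel of a convolution, which is how the models above are folded. A rough NumPy sketch of the per-channel rescaling (illustrative only; assumes an OHWI kernel layout and the σ + ε convention used throughout):

```python
import numpy as np

rng = np.random.default_rng(1)
c_out, kh, kw, c_in = 3, 3, 3, 2

W = rng.normal(size=(c_out, kh, kw, c_in))  # conv kernel, one slice per output channel
b = rng.normal(size=c_out)                  # conv bias

# Batch-normalization parameters (one per output channel)
gamma = rng.normal(size=c_out)
beta = rng.normal(size=c_out)
mu = rng.normal(size=c_out)
sigma = rng.uniform(0.5, 2.0, size=c_out)
eps = 1e-3

# Fold BN into the kernel: rescale each output channel, shift the bias
scale = gamma / (sigma + eps)
W_fold = scale[:, None, None, None] * W
b_fold = scale * (b - mu) + beta

# Check on a single output pixel: channel o of the conv output is
# sum(W[o] * patch) + b[o] over the receptive field
patch = rng.normal(size=(kh, kw, c_in))
y_bn = gamma * ((W * patch).sum(axis=(1, 2, 3)) + b - mu) / (sigma + eps) + beta
y_fold = (W_fold * patch).sum(axis=(1, 2, 3)) + b_fold
assert np.allclose(y_bn, y_fold)
```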
To Do
- unit tests on all keras applications models
- check package installation
- deal with Concatenate layers
Cite
@inproceedings{yvinec2022fold,
  title={To Fold or Not to Fold: a Necessary and Sufficient Condition on Batch-Normalization Layers Folding},
  author={Yvinec, Edouard and Dapogny, Arnaud and Bailly, Kevin},
  booktitle={IJCAI},
  year={2022}
}
Performance on Base Models
| Model               | BN layers folded | BN layers not folded |
|---------------------|------------------|----------------------|
| ResNet 50           | 53               | 0                    |
| EfficientNet B0     | 49               | 0                    |
| MobileNet V2        | 52               | 0                    |
| MobileNet V3        | 34               | 0                    |
| Inception ResNet V2 | 204              | 0                    |
| Inception V3        | 94               | 0                    |
| NASNet              | 28               | 164                  |
| DenseNet 121        | 59               | 62                   |