
Simplification of pruned models for accelerated inference

Project description

Simplify

Simplification of pruned models for accelerated inference.

Installation

Simplify can be installed using pip:

pip3 install torch-simplify

or if you want to run the latest version of the code, you can install from git:

git clone https://github.com/EIDOSlab/simplify
cd simplify
pip3 install -r requirements.txt

Usage

Main function

For most scenarios, the main simplify function will suffice; it returns the simplified model.

Arguments

The expected arguments are:

  • model (torch.nn.Module): the PyTorch model to be simplified.
  • x (torch.Tensor): a zero tensor of shape [1, C, N, M], matching the shape of the model's usual input.
  • bn_folding (List): list of (nn.Conv2d, nn.BatchNorm2d) tuples to be fused. If None, the pairs are inferred from the model. Defaults to None.
  • fuse_bn (bool): if True, fuses each Conv-BatchNorm pair.
  • pinned_out (List): list of nn.Modules whose output must keep its original shape (e.g. layers feeding a residual connection with a sum operation).

Minimal working example

import torch
from torchvision import models
from simplify import simplify

model = models.resnet18()

# Apply some pruning strategy or load a pruned checkpoint

dummy_input = torch.zeros(1, 3, 224, 224)  # Tensor shape is that of a standard input for the given model
simplified_model = simplify(model, dummy_input)
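To make the pruning step concrete, here is a sketch that uses PyTorch's torch.nn.utils.prune for structured channel pruning on a small sequential CNN (chosen so that there are no residual connections and pinned_out is not needed). The fuse_bn keyword follows the argument list above; the rest is an illustration under these assumptions, not a prescribed workflow.

import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
from simplify import simplify

# A small CNN without residual connections
model = nn.Sequential(
    nn.Conv2d(3, 64, 3, padding=1),
    nn.BatchNorm2d(64),
    nn.ReLU(),
    nn.Conv2d(64, 128, 3, padding=1),
    nn.BatchNorm2d(128),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(128, 10),
)

# Structured pruning: zero out 50% of the output channels of each Conv2d
for module in model.modules():
    if isinstance(module, nn.Conv2d):
        prune.ln_structured(module, name="weight", amount=0.5, n=2, dim=0)
        prune.remove(module, "weight")  # make the zeroed weights permanent

model.eval()  # put the model in inference mode before folding BN statistics
dummy_input = torch.zeros(1, 3, 224, 224)
simplified = simplify(model, dummy_input, fuse_bn=True)  # keyword as documented above

# Fewer parameters remain after the zeroed channels are removed
print(sum(p.numel() for p in simplified.parameters()))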

Submodules

The simplify function is composed of three different submodules: fuse, propagate and remove. Each module can be used independently as needed.

fuse

Fuses adjacent Conv (or Linear) and BatchNorm layers.
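Conceptually, this is the same folding performed by PyTorch's torch.nn.utils.fusion.fuse_conv_bn_eval utility; the sketch below illustrates the idea and is not the library's internal API.

import torch
import torch.nn as nn
from torch.nn.utils.fusion import fuse_conv_bn_eval

conv = nn.Conv2d(3, 8, 3, padding=1).eval()
bn = nn.BatchNorm2d(8).eval()

fused = fuse_conv_bn_eval(conv, bn)  # a single Conv2d with the BN statistics folded in

x = torch.randn(1, 3, 32, 32)
assert torch.allclose(bn(conv(x)), fused(x), atol=1e-5)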

propagate

Propagates the non-zero biases of pruned neurons to the remaining neurons of the following layers.

remove

Removes zeroed neurons from the architecture.
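A toy sketch of what propagate and remove accomplish together, using two Linear layers. This illustrates the underlying idea (fold the pruned neuron's constant contribution into the next bias, then drop the neuron), not the library's internal API.

import torch
import torch.nn as nn

torch.manual_seed(0)
fc1, fc2 = nn.Linear(4, 3), nn.Linear(3, 2)
relu = nn.ReLU()

# "Prune" neuron 1 of fc1: zero its weights but leave its non-zero bias
with torch.no_grad():
    fc1.weight[1].zero_()

x = torch.randn(5, 4)
reference = fc2(relu(fc1(x)))  # output of the pruned, not yet simplified model

with torch.no_grad():
    # propagate: the pruned neuron now outputs the constant relu(bias),
    # which can be folded into the bias of the next layer
    constant = relu(fc1.bias[1])
    fc2.bias += fc2.weight[:, 1] * constant

    # remove: drop the pruned neuron from fc1 and the matching input column of fc2
    keep = [0, 2]
    small_fc1 = nn.Linear(4, 2)
    small_fc1.weight.copy_(fc1.weight[keep])
    small_fc1.bias.copy_(fc1.bias[keep])
    small_fc2 = nn.Linear(2, 2)
    small_fc2.weight.copy_(fc2.weight[:, keep])
    small_fc2.bias.copy_(fc2.bias)

# The smaller network produces the same outputs as the pruned original
assert torch.allclose(reference, small_fc2(relu(small_fc1(x))), atol=1e-6)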


Citing

If you use this software for research or application purposes, please use the following citation:

@article{bragagnolo2021simplify,
  title = {Simplify: A Python library for optimizing pruned neural networks},
  journal = {SoftwareX},
  volume = {17},
  pages = {100907},
  year = {2022},
  issn = {2352-7110},
  doi = {10.1016/j.softx.2021.100907},
  url = {https://www.sciencedirect.com/science/article/pii/S2352711021001576},
  author = {Andrea Bragagnolo and Carlo Alberto Barbano},
}
