
Simplify

Simplification of pruned models for accelerated inference.

Installation

Simplify can be installed using pip:

pip3 install torch-simplify

or if you want to run the latest version of the code, you can install from git:

git clone https://github.com/EIDOSlab/simplify
cd simplify
pip3 install -r requirements.txt

Usage

Main function

For most scenarios the main simplify function will suffice. This function returns the simplified model.

Arguments

The expected arguments are:

  • model (torch.nn.Module): the PyTorch model to be simplified.
  • x (torch.Tensor): a zero tensor of shape [1, C, N, M], matching the model's usual input shape.
  • bn_folding (List): list of (nn.Conv2d, nn.BatchNorm2d) tuples to be fused. If None, they are inferred from the model. Default: None.
  • fuse_bn (bool): if True, fuses each conv-bn pair.
  • pinned_out (List): list of nn.Modules whose output must keep its original shape (e.g. layers feeding a residual connection with a sum operation).

Minimal working example

import torch
from torchvision import models
from simplify import simplify

model = models.resnet18()

# Apply some pruning strategy or load a pruned checkpoint

dummy_input = torch.zeros(1, 3, 224, 224)  # Tensor shape is that of a standard input for the given model
simplified_model = simplify(model, dummy_input)
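For the "apply some pruning strategy" step, one option (an illustration only, not the only strategy the library supports) is PyTorch's built-in structured pruning, which zeroes whole output channels so that simplify can later remove them. The layer and pruning amount below are made up for the sketch:

```python
import torch
import torch.nn.utils.prune as prune

# A standalone conv layer stands in for any layer of a larger model.
conv = torch.nn.Conv2d(3, 8, kernel_size=3)

# Zero half of the output channels by L2 norm (structured pruning on dim 0),
# then make the pruning permanent so the weights themselves become zero.
prune.ln_structured(conv, name="weight", amount=0.5, n=2, dim=0)
prune.remove(conv, "weight")

# Count fully zeroed output channels: these are what simplify can strip.
zeroed = (conv.weight.abs().sum(dim=(1, 2, 3)) == 0).sum().item()
print(zeroed)  # 4 of 8 output channels are zeroed
```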

Submodules

The simplify function is composed of three different submodules: fuse, propagate and remove. Each module can be used independently as needed.

fuse

Fuses adjacent Conv (or Linear) and BatchNorm layers.
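Conceptually, folding a BatchNorm layer into the preceding convolution rewrites the two chained affine transforms as a single one. A minimal single-channel sketch in plain Python (the numbers are made up; the library operates on full nn.Conv2d/nn.BatchNorm2d weight tensors):

```python
import math

# One channel: conv output y = w*x + b, followed by batch norm
# z = gamma * (y - mean) / sqrt(var + eps) + beta.
# Substituting y gives a single affine layer z = w_fused * x + b_fused.
w, b = 0.5, 0.1                   # conv weight and bias
gamma, beta = 1.2, -0.3           # BN scale and shift
mean, var, eps = 0.4, 0.25, 1e-5  # BN running statistics

scale = gamma / math.sqrt(var + eps)
w_fused = w * scale
b_fused = (b - mean) * scale + beta

x = 2.0
y = w * x + b
z = gamma * (y - mean) / math.sqrt(var + eps) + beta  # conv -> BN
z_fused = w_fused * x + b_fused                       # single fused layer
print(abs(z - z_fused) < 1e-9)  # the two computations agree
```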

propagate

Propagates the non-zero biases of pruned neurons to the remaining neurons of the following layers.
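The idea can be sketched with a toy two-layer network in plain Python (made-up numbers; the activation between layers is omitted for simplicity): a pruned neuron with all-zero weights but a non-zero bias contributes a constant downstream, and that constant can be absorbed into the next layer's bias:

```python
# h = W1 @ x + b1, out = W2 @ h + b2. Neuron 1 of layer 1 is pruned (all
# weights zero) but its bias is non-zero, so it adds the constant
# W2[:, 1] * b1[1] to the output. Propagation folds that constant into b2
# and zeroes b1[1], leaving the neuron fully removable.
W1 = [[0.2, -0.1], [0.0, 0.0]]   # row 1 pruned: weights all zero
b1 = [0.05, 0.7]                 # ...but its bias is non-zero
W2 = [[1.0, 0.5]]
b2 = [0.1]

def forward(W1, b1, W2, b2, x):
    h = [sum(w * xi for w, xi in zip(row, x)) + b for row, b in zip(W1, b1)]
    return [sum(w * hi for w, hi in zip(row, h)) + b for row, b in zip(W2, b2)]

x = [1.0, 2.0]
before = forward(W1, b1, W2, b2, x)

# Absorb the pruned neuron's bias into the next layer's bias.
b2_new = [v + W2[i][1] * b1[1] for i, v in enumerate(b2)]
b1_new = [b1[0], 0.0]
after = forward(W1, b1_new, W2, b2_new, x)
print(before, after)  # identical outputs
```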

remove

Removes zeroed neurons from the architecture.
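Once a neuron's weights and (propagated) bias are all zero, removal amounts to deleting its row in the current layer and the matching column in the next layer, shrinking both weight matrices without changing the output. A toy fully connected sketch with made-up numbers:

```python
W1 = [[0.2, -0.1], [0.0, 0.0]]   # neuron 1 is fully zeroed
b1 = [0.05, 0.0]
W2 = [[1.0, 0.5]]
b2 = [0.45]

def forward(W1, b1, W2, b2, x):
    h = [sum(w * xi for w, xi in zip(row, x)) + b for row, b in zip(W1, b1)]
    return [sum(w * hi for w, hi in zip(row, h)) + b for row, b in zip(W2, b2)]

dead = 1  # index of the zeroed neuron
W1_s = [row for i, row in enumerate(W1) if i != dead]   # drop its row
b1_s = [v for i, v in enumerate(b1) if i != dead]       # drop its bias
W2_s = [[w for j, w in enumerate(row) if j != dead] for row in W2]  # drop its column

x = [1.0, 2.0]
out_full = forward(W1, b1, W2, b2, x)
out_slim = forward(W1_s, b1_s, W2_s, b2, x)
print(out_full, out_slim)  # same output, smaller network
```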


Citing

If you use this software for research or application purposes, please use the following citation:

@article{bragagnolo2021simplify,
  title = {Simplify: A Python library for optimizing pruned neural networks},
  journal = {SoftwareX},
  volume = {17},
  pages = {100907},
  year = {2022},
  issn = {2352-7110},
  doi = {10.1016/j.softx.2021.100907},
  url = {https://www.sciencedirect.com/science/article/pii/S2352711021001576},
  author = {Andrea Bragagnolo and Carlo Alberto Barbano},
}
