
torchconvquality


A library for PyTorch model convolution quality analysis.

Installation

To install published releases from PyPI, run:

pip install torchconvquality

To update torchconvquality to the latest available version, add the --upgrade flag to the above command.

If you want the latest (potentially unstable) features, you can also install directly from the GitHub main branch:

pip install git+https://github.com/paulgavrikov/torchconvquality

Usage

Just import the package and run measure_quality on your model. This function returns a dict with quality metrics for every 2D convolution layer with a 3x3 kernel. Note: theoretically, this could also be extended to larger kernel sizes. Let us know through a GitHub issue if you are interested in that being implemented.

from torchconvquality import measure_quality

model = ... # your model
quality_dict = measure_quality(model)
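
For example, a pretrained torchvision ResNet-18 can be measured as follows (a minimal sketch assuming torchvision is installed; the weights argument follows the torchvision API):

import torchvision.models as models
from torchconvquality import measure_quality

# Load an ImageNet-pretrained ResNet-18 from torchvision
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
quality_dict = measure_quality(model)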

Here is an example output (pretrained ResNet-18 on ImageNet):

{'layer1.0.conv1': {'sparsity': 0.125244140625,
                    'variance_entropy': 0.8243831176467854},
 'layer1.0.conv2': {'sparsity': 0.0, 'variance_entropy': 0.8540944028708247},
 'layer1.1.conv1': {'sparsity': 0.0, 'variance_entropy': 0.880116579714338},
 'layer1.1.conv2': {'sparsity': 0.0, 'variance_entropy': 0.8770092802517852},
 'layer2.0.conv1': {'sparsity': 0.0, 'variance_entropy': 0.9162120601419921},
 'layer2.0.conv2': {'sparsity': 0.0, 'variance_entropy': 0.79917093039702},
 'layer2.1.conv1': {'sparsity': 0.0, 'variance_entropy': 0.8988180721697099},
 'layer2.1.conv2': {'sparsity': 0.0, 'variance_entropy': 0.8584897149301801},
 'layer3.0.conv1': {'sparsity': 0.0, 'variance_entropy': 0.589569852560285},
 'layer3.0.conv2': {'sparsity': 0.0, 'variance_entropy': 0.7655632562758724},
 'layer3.1.conv1': {'sparsity': 0.0, 'variance_entropy': 0.8485658915907506},
 'layer3.1.conv2': {'sparsity': 1.52587890625e-05,
                    'variance_entropy': 0.7960795856993427},
 'layer4.0.conv1': {'sparsity': 7.62939453125e-06,
                    'variance_entropy': 0.6701797219658017},
 'layer4.0.conv2': {'sparsity': 7.62939453125e-06,
                    'variance_entropy': 0.8185696588740375},
 'layer4.1.conv1': {'sparsity': 0.0, 'variance_entropy': 0.6583874160290571},
 'layer4.1.conv2': {'sparsity': 0.001796722412109375,
                    'variance_entropy': 0.21928562164990348}}

Supported Metrics

Sparsity

Sparsity measures the ratio of 2D filters whose $l_\infty$-norm is lower than sparsity_eps (default: 1%) of the highest filter norm in that layer. These filters will most likely not contribute to the learned function beyond noise. You should minimize this value if you are interested in exploiting all of your available model capacity. On the other hand, a high sparsity value means that many weights can be pruned successfully.
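
As an illustration, here is a sketch of how this metric could be computed (not the library's exact implementation):

import torch

def sparsity(weight: torch.Tensor, sparsity_eps: float = 0.01) -> float:
    # weight: conv weight of shape (out_channels, in_channels, k, k)
    filters = weight.reshape(-1, weight.shape[-2] * weight.shape[-1])  # one flattened 2D kernel per row
    norms = filters.abs().amax(dim=1)        # l-infinity norm of each 2D kernel
    threshold = sparsity_eps * norms.max()   # default: 1% of the largest norm in the layer
    return (norms < threshold).float().mean().item()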

Variance Entropy

Variance Entropy captures the diversity of filter patterns in your conv layer. We have observed that significantly overparameterized networks learn many redundant filters in deeper layers. Hence we assume that, generally, you would like to increase diversity. A good value is somewhere around 0.9; this means that the layer in question has learned a filter distribution that is significantly different from random. A value close to 0 indicates highly redundant filters. A value above 1 indicates a random distribution, such as you would find prior to any training (i.e. right after initialization) or in a GAN discriminator at the end of training (when it can no longer distinguish between real and fake inputs).
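
As a rough illustration only, the sketch below treats variance entropy as the entropy of the variance explained by each principal component of the flattened filters, normalized by the maximum possible entropy. This is an assumption based on the accompanying CVPR 2022 paper, not the library's exact code; in particular, the library's normalization must differ, since values above 1 are possible there.

import torch

def variance_entropy(filters: torch.Tensor) -> float:
    # filters: one flattened 2D kernel per row, e.g. shape (n, 9) for 3x3 kernels
    centered = filters - filters.mean(dim=0)  # center before PCA
    s = torch.linalg.svdvals(centered)        # singular values of the filter matrix
    var_ratio = s**2 / (s**2).sum()           # variance explained by each component
    entropy = -(var_ratio * torch.log(var_ratio + 1e-12)).sum()
    max_entropy = torch.log(torch.tensor(float(s.numel())))
    return (entropy / max_entropy).item()

For a conv weight of shape (out_channels, in_channels, 3, 3), this would be called as variance_entropy(weight.reshape(-1, 9)).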

Variance Entropy Clean

Variance Entropy Clean is simply Variance Entropy applied to the layer weights after pruning near-zero filters (as defined by sparsity_eps).
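
Building on the two sketches above (again purely illustrative), this could look like:

def variance_entropy_clean(weight: torch.Tensor, sparsity_eps: float = 0.01) -> float:
    # Flatten to one 2D kernel per row, drop near-zero kernels, then measure diversity of the rest
    filters = weight.reshape(-1, weight.shape[-2] * weight.shape[-1])
    norms = filters.abs().amax(dim=1)
    return variance_entropy(filters[norms >= sparsity_eps * norms.max()])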

Citation

Please consider citing our publication if this library was helpful to you.

@InProceedings{Gavrikov_2022_CVPR,
    author    = {Gavrikov, Paul and Keuper, Janis},
    title     = {CNN Filter DB: An Empirical Investigation of Trained Convolutional Filters},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2022},
    pages     = {19066-19076}
}

Legal

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Funded by the Ministry for Science, Research and Arts, Baden-Wuerttemberg, Germany Grant 32-7545.20/45/1 (Q-AMeLiA).

Download files

Download the file for your platform.

Source Distribution

torchconvquality-0.3.0.tar.gz (11.7 kB)


Built Distribution

torchconvquality-0.3.0-py3-none-any.whl (12.3 kB)


File details

Details for the file torchconvquality-0.3.0.tar.gz.

File metadata

  • Download URL: torchconvquality-0.3.0.tar.gz
  • Size: 11.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for torchconvquality-0.3.0.tar.gz:

  • SHA256: 8468311ff7dc0ac7345da772a4df39f1ae4f3a85da46600f425e6cc68ca4397a
  • MD5: 18652fc42a9dce492590f240c887aeaa
  • BLAKE2b-256: bf5648eb34de83ca5dd3ba3d04e9a71f4c6a8acab3929c39584b6282e32f5f3f


File details

Details for the file torchconvquality-0.3.0-py3-none-any.whl.

File hashes

Hashes for torchconvquality-0.3.0-py3-none-any.whl:

  • SHA256: 6cb93e94f21eaaf9cbb9bac98ccf68b78b182bd8c1497e5926de00976130abd0
  • MD5: d7a3fee9c4eefb3b6d7518b388ab0ce8
  • BLAKE2b-256: 9fcb3a0f496ffaf74490a2dff2c92293206d5a76f8d7d6c2f95b8a8afe33e6a2

