
Bayesian Neural Network Pruning Library


Bayesian Neural Network Pruning



BPrune is developed to perform inference and pruning of Bayesian Neural Network (BNN) models developed with TensorFlow and TensorFlow Probability. The BNNs supported by the package are those that use the mean-field approximation of Variational Inference (VI), i.e. Gaussian distributions to define the priors on the weights. Currently, the pruning criterion is a threshold on the signal-to-noise ratio of each weight.


  1. Library for performing inference with a trained Bayesian Neural Network (BNN).
  2. Library for pruning a trained Bayesian Neural Network (BNN).
  3. Supports TensorFlow and TensorFlow Probability based BNN model architectures.
  4. Independent of the BNN's learning task; supports BNN models for both classification and regression.
  5. Capable of handling BNNs trained with distributed training libraries such as Horovod.
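The signal-to-noise-ratio pruning criterion mentioned above can be sketched as follows. This is a minimal illustration with made-up names (`snr_prune_mask`, the toy arrays) and an arbitrary threshold, not BPrune's actual API: for mean-field Gaussian weights, each weight's SNR is |mu| / sigma, and weights below the threshold are zeroed.

```python
import numpy as np

def snr_prune_mask(mu, sigma, threshold):
    """Return a 0/1 mask keeping weights whose SNR = |mu| / sigma
    exceeds the threshold."""
    snr = np.abs(mu) / sigma
    return (snr > threshold).astype(mu.dtype)

# Posterior means and standard deviations of a toy 2x2 weight matrix.
mu = np.array([[0.8, 0.01], [-0.5, 0.02]])
sigma = np.array([[0.1, 0.2], [0.1, 0.5]])

mask = snr_prune_mask(mu, sigma, threshold=1.0)
pruned_mu = mu * mask  # low-SNR (noisy) weights are zeroed out
```

Here the two weights with small means and comparatively large standard deviations have SNR well below 1 and are pruned, while the two confident weights survive.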

Installation Instructions:

  • Before installation, ensure that you have a working TensorFlow and TensorFlow Probability environment.
python3 -m pip install -r requirements.txt
python3 setup.py install

If you are using a pip installation, simply do

python3 -m pip install BPrune
  • For development of the package, the following command can be used after git clone.
python3 setup.py develop

Quick Start Guide

  • Before running the model for inference or pruning, ensure that, at the end of the training script, details about the layer names and operations in the graph are written to text files.

  • To achieve this, the user can use the utility provided with BPrune named Graph_Info_Writer.

  • The usage of the utility is as follows:

    import numpy as np
    import tensorflow as tf
    import bprune.src.utils as UT

    # All the code for training the BNN
    # ...

    # This path will be used as the model_dir path in the argument when running BNN for inference
    case_dir = 'path/to/the/casefolder'
    UT.Graph_Info_Writer(case_dir)
  • For a successful run of BPrune, the following files ('LayerNames.txt', 'Ops_name_BNN.txt') must be present in the case directory. The procedure described above ensures these files are written at the end of the BNN training.

  • Once the required text files are written at the end of training, BPrune can be used. An example use case can be found in the example folder shipped with the package.

  • The runtime arguments to a BPrune code can be provided on the command line, or specified in a text file with each line stating one argument. Example:

    python @ArgFilePrune.txt
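The "@-file" argument style shown above matches the behavior of Python's standard argparse module when `fromfile_prefix_chars='@'` is set: argparse reads one argument per line from the named file. The option names below (`--model_dir`, `--num_samples`) are illustrative placeholders, not BPrune's actual options.

```python
import argparse

# A parser that accepts '@somefile.txt' on the command line and expands
# it into one argument per line of that file.
parser = argparse.ArgumentParser(fromfile_prefix_chars='@')
parser.add_argument('--model_dir')
parser.add_argument('--num_samples', type=int, default=10)

# Invoking the script as `python script.py @ArgFilePrune.txt` would read
# arguments from a file containing lines such as:
#   --model_dir=path/to/the/casefolder
#   --num_samples=50
# Here we parse an equivalent argument list directly for illustration.
args = parser.parse_args(['--model_dir', 'case1', '--num_samples', '50'])
```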


Todo:

  • Currently only supports models trained using TensorFlow placeholders for feeding data to the graph.
  • Pruning algorithm for models using approximation functions other than Mean Field for Variational Inference.
  • Unit tests for the functionalities.


Citation:

  • BibTeX Format (arXiv):

    @article{sharma2020bayesian,
      title={Bayesian Neural Networks at Scale: A Performance Analysis and Pruning Study},
      author={Sharma, Himanshu and Jennings, Elise},
      journal={arXiv preprint arXiv:2005.11619},
      year={2020}
    }

  • MLA Format (arXiv):

    Sharma, Himanshu, and Elise Jennings. "Bayesian Neural Networks at Scale: A Performance Analysis and Pruning Study." arXiv preprint arXiv:2005.11619 (2020).



This research was funded in part and used resources of the Argonne Leadership Computing Facility, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357. This paper describes objective technical results and analysis. Any subjective views or opinions that might be expressed in the paper do not necessarily represent the views of the U.S. DOE or the United States Government. Declaration of Interests: None.


Download files

Source Distribution: BPrune-0.0.1.tar.gz (12.6 kB)

Built Distribution: BPrune-0.0.1-py3-none-any.whl (15.1 kB)
