
Assumed Density Filtering (ADF) Probabilistic Networks

Project description


torch-adf provides probabilistic PyTorch neural-network layers based on assumed density filtering. Assumed density filtering (ADF) is a general concept from Bayesian inference; for the feed-forward neural networks considered here, it provides a way to approximately propagate a probability distribution through the network.
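For a purely affine layer, this propagation is exact: a Gaussian input with mean m and covariance C maps to a Gaussian with mean Wm + b and covariance WCWᵀ, and the approximation only enters at nonlinearities, where the output is moment-matched back to a Gaussian. The affine case can be sketched in a few lines of pure Python (an illustrative sketch of the math, not torch-adf code):

```python
# Exact Gaussian propagation through an affine layer y = W x + b:
#   mean:       m' = W m + b
#   covariance: C' = W C W^T
# For nonlinear layers, ADF would additionally moment-match the
# (non-Gaussian) output distribution back to a Gaussian.

def affine_mean(W, b, m):
    return [sum(W[i][j] * m[j] for j in range(len(m))) + b[i]
            for i in range(len(W))]

def affine_cov(W, C):
    n, k = len(W), len(C)
    # First compute W C ...
    WC = [[sum(W[i][j] * C[j][l] for j in range(k)) for l in range(k)]
          for i in range(n)]
    # ... then (W C) W^T.
    return [[sum(WC[i][j] * W[l][j] for j in range(k)) for l in range(n)]
            for i in range(n)]

W = [[1.0, 2.0]]                  # one output, two inputs
b = [0.5]
m = [1.0, -1.0]                   # input means
C = [[1.0, 0.0], [0.0, 4.0]]      # independent inputs with variances 1 and 4

print(affine_mean(W, b, m))       # [-0.5]   (1*1 + 2*(-1) + 0.5)
print(affine_cov(W, C))           # [[17.0]] (1^2 * 1 + 2^2 * 4)
```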

The layers in this package have the same names and arguments as their corresponding PyTorch versions. We use Gaussian distributions for our ADF approximations, which are described by their means and (co-)variances. So unlike the standard PyTorch layers, each torch-adf layer takes two inputs and produces two outputs (one for the means and one for the (co-)variances).

Within a model, torch-adf layers are used just like the corresponding PyTorch layers. However, since ADF layers take two inputs and produce two outputs instead of one, ADF and standard layers cannot simply be mixed within the same model.

from torch.nn import Sequential
from torchadf.nn import Linear

# An ADF layer is constructed with the same arguments
# as its PyTorch counterpart.
in_dim, out_dim = 64, 32
adflayer = Linear(in_dim, out_dim)
model = Sequential(adflayer)
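The pair-in/pair-out convention described above can be mimicked in a short pure-Python sketch (the class name ADFLinearSketch is made up for illustration and is not the torch-adf implementation; it assumes independent inputs, i.e. diagonal covariances):

```python
# Hypothetical sketch of an ADF layer's calling convention: each call
# consumes (means, variances) and returns (means, variances).
# With independent inputs (diagonal covariance), the variances of a
# linear layer propagate element-wise: var_out_i = sum_j W_ij^2 * var_j.

class ADFLinearSketch:
    def __init__(self, weights, bias):
        self.W = weights  # list of weight rows, one per output unit
        self.b = bias     # list of biases, one per output unit

    def __call__(self, means, variances):
        out_means = [
            sum(w * m for w, m in zip(row, means)) + bi
            for row, bi in zip(self.W, self.b)
        ]
        out_vars = [
            sum(w * w * v for w, v in zip(row, variances))
            for row in self.W
        ]
        return out_means, out_vars

layer = ADFLinearSketch([[0.5, -0.5]], [1.0])
mean, var = layer([2.0, 4.0], [1.0, 1.0])
print(mean, var)  # [0.0] [0.5]
```

Chaining several such layers simply threads the (means, variances) pair from one layer to the next, which is why a model must consist either entirely of ADF layers or entirely of standard ones.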

The Overview and Examples sections of our documentation provide more realistic and complete examples.

Project Information

torch-adf is released under the MIT license, its documentation lives at Read the Docs, the code on GitHub, and the latest release can be found on PyPI. It’s tested on Python 3.6+.

If you’d like to contribute to torch-adf, you’re most welcome. We have written a short guide to help you get started!

Further Reading

Additional information on the algorithmic aspects of torch-adf can be found in the following works:

  • Jochen Gast, Stefan Roth, “Lightweight Probabilistic Deep Networks”, 2018

  • Jan Macdonald, Stephan Wäldchen, Sascha Hauch, Gitta Kutyniok, “A Rate-Distortion Framework for Explaining Neural Network Decisions”, 2019

Acknowledgments

During the setup of this project we were heavily influenced and inspired by the work of Hynek Schlawack, in particular his attrs package and his blog posts on testing, packaging, and deploying to PyPI. Thank you for sharing your experiences and insights.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

torch-adf-22.1.0.tar.gz (36.2 kB)

Uploaded Source

Built Distribution

torch_adf-22.1.0-py2.py3-none-any.whl (15.1 kB)

Uploaded Python 2 Python 3

File details

Details for the file torch-adf-22.1.0.tar.gz.

File metadata

  • Download URL: torch-adf-22.1.0.tar.gz
  • Upload date:
  • Size: 36.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.8.13

File hashes

Hashes for torch-adf-22.1.0.tar.gz
  • SHA256: f73ff3a7a52f16fbeafe978908296aff4e32ffbf4b4f3423e9c8929b029b7ae7
  • MD5: ffcda3aa4f76e2e778ea118aa7149967
  • BLAKE2b-256: a0bab126d3992a5c5a31dd1fb2395646857edad5f6264b6ee7e556049f82cc29


File details

Details for the file torch_adf-22.1.0-py2.py3-none-any.whl.

File metadata

  • Download URL: torch_adf-22.1.0-py2.py3-none-any.whl
  • Upload date:
  • Size: 15.1 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.8.13

File hashes

Hashes for torch_adf-22.1.0-py2.py3-none-any.whl
  • SHA256: f7e99198f4cb069635fc342f4d109a2340cf4887dfe78d11bf75242ffd166685
  • MD5: 68677e4fab8eb6aa13845157f5aeead9
  • BLAKE2b-256: 5a8443a6acfa93ed7ce8256300a3d2a11b8e74308c3345c5665c2d069cc2ba83

