
Uncertainty quantification library in PyTorch

Project description


TorchUncertainty is a package designed to help you leverage uncertainty quantification techniques and make your deep neural networks more reliable. It aims to be collaborative and to include as many methods as possible, so reach out to add yours!

🚧 TorchUncertainty is in early development 🚧 - expect changes, but reach out and contribute if you are interested in the project! Please raise an issue if you encounter any bugs or difficulties, and join the Discord server.


This package provides a multi-level API, including:

  • ready-to-train baselines on research datasets, such as ImageNet and CIFAR
  • deep learning baselines available for training on your datasets
  • pretrained weights for these baselines on ImageNet and CIFAR (work in progress 🚧)
  • layers available for use in your networks
  • scikit-learn style post-processing methods such as Temperature Scaling

See the Reference page or the API reference for a more exhaustive list of the implemented methods, datasets, metrics, etc.
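
To give a flavor of the quantities these methods and metrics operate on, here is a minimal plain-PyTorch sketch (purely illustrative, not TorchUncertainty's own API) that computes the predictive entropy and mutual information of an ensemble of classifiers from their logits:

import torch

def ensemble_uncertainty(logits: torch.Tensor, eps: float = 1e-12):
    """Toy uncertainty measures for an ensemble.

    logits has shape (n_members, batch, n_classes). Returns the predictive
    entropy (total uncertainty) and the mutual information (its epistemic
    part) for each sample in the batch.
    """
    probs = logits.softmax(dim=-1)                  # (M, B, C)
    mean_probs = probs.mean(dim=0)                  # (B, C) ensemble average
    # Entropy of the averaged prediction: total uncertainty.
    total = -(mean_probs * (mean_probs + eps).log()).sum(dim=-1)
    # Mean entropy of the individual members: aleatoric part.
    aleatoric = -(probs * (probs + eps).log()).sum(dim=-1).mean(dim=0)
    # Their difference, the mutual information, is the epistemic part.
    return total, total - aleatoric

# Toy usage: 4 members, a batch of 8 samples, 10 classes.
total, epistemic = ensemble_uncertainty(torch.randn(4, 8, 10))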

Installation

Install the desired PyTorch version in your environment. Then, install the package from PyPI:

pip install torch-uncertainty

If you aim to contribute, have a look at the contribution page.

Getting Started and Documentation

Please find the documentation at torch-uncertainty.github.io.

A quickstart is available at torch-uncertainty.github.io/quickstart.

Implemented methods

Baselines

To date, the following deep learning baselines have been implemented:

  • Deep Ensembles
  • MC-Dropout - Tutorial (see the illustrative sketch after this list)
  • BatchEnsemble
  • Masksembles
  • MIMO
  • Packed-Ensembles (see blog post) - Tutorial
  • Bayesian Neural Networks 🚧 Work in progress 🚧 - Tutorial
  • Regression with Beta Gaussian NLL Loss
  • Deep Evidential Classification & Regression - Tutorial
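
As an illustration of one of these baselines, here is a minimal plain-PyTorch sketch of MC-Dropout (not TorchUncertainty's implementation): dropout layers are kept active at test time and the softmax predictions of several stochastic forward passes are averaged.

import torch
from torch import nn

def mc_dropout_predict(model: nn.Module, x: torch.Tensor, n_samples: int = 16) -> torch.Tensor:
    """Average softmax predictions over stochastic forward passes."""
    model.eval()
    # Re-enable only the dropout layers, keeping e.g. batch norm in eval mode.
    for module in model.modules():
        if isinstance(module, nn.Dropout):
            module.train()
    with torch.no_grad():
        probs = torch.stack([model(x).softmax(dim=-1) for _ in range(n_samples)])
    return probs.mean(dim=0)  # (batch, n_classes)

# Toy usage with a small classifier that contains dropout.
net = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Dropout(0.2), nn.Linear(64, 10))
mean_probs = mc_dropout_predict(net, torch.randn(8, 32))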

Augmentation methods

The following data augmentation methods have been implemented:

  • Mixup, MixupIO, RegMixup, WarpingMixup (a minimal Mixup sketch follows below)
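
For reference, below is a minimal plain-PyTorch sketch of standard Mixup (the variants listed above change how the mixing is parameterized or applied): a batch is blended with a shuffled copy of itself using a Beta-distributed coefficient, and the training loss is interpolated accordingly.

import torch

def mixup(x: torch.Tensor, y: torch.Tensor, alpha: float = 1.0):
    """Standard Mixup: blend a batch with a shuffled copy of itself."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))
    mixed_x = lam * x + (1 - lam) * x[perm]
    # Train with: lam * loss(pred, y) + (1 - lam) * loss(pred, y[perm]).
    return mixed_x, y, y[perm], lam

# Toy usage on a batch of images and integer labels.
inputs, y_a, y_b, lam = mixup(torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,)))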

Post-processing methods

To date, the following post-processing methods have been implemented:

  • Temperature, Vector, & Matrix scaling - Tutorial (see the sketch below)
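
To give an idea of how such post-processing works, here is a minimal plain-PyTorch sketch of temperature scaling (purely illustrative; TorchUncertainty exposes its scalers through a scikit-learn style interface, as noted above): a single temperature is fitted on held-out validation logits by minimizing the negative log-likelihood, and test-time logits are then divided by it before the softmax.

import torch
from torch import nn

def fit_temperature(val_logits: torch.Tensor, val_labels: torch.Tensor) -> torch.Tensor:
    """Fit a single temperature on validation logits by minimizing the NLL."""
    log_t = nn.Parameter(torch.zeros(1))  # optimize log T so that T stays positive
    optimizer = torch.optim.LBFGS([log_t], lr=0.1, max_iter=50)

    def closure():
        optimizer.zero_grad()
        loss = nn.functional.cross_entropy(val_logits / log_t.exp(), val_labels)
        loss.backward()
        return loss

    optimizer.step(closure)
    return log_t.exp().detach()

# Toy usage with random validation logits and labels.
val_logits, val_labels = torch.randn(100, 10), torch.randint(0, 10, (100,))
temperature = fit_temperature(val_logits, val_labels)
calibrated_probs = (val_logits / temperature).softmax(dim=-1)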

Tutorials

We provide several tutorials in our documentation; see the Tutorial links next to the methods listed above.

Awesome Uncertainty repositories

You can find many papers on modern uncertainty estimation techniques in the Awesome Uncertainty in Deep Learning repository.

Other References

This package also contains the official implementation of Packed-Ensembles.
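
For context, the core idea of Packed-Ensembles is to pack several independent subnetworks into a single network using grouped convolutions, so that all ensemble members are trained and evaluated in one forward pass. Below is a heavily simplified plain-PyTorch sketch of a "packed" fully connected layer built from a grouped 1x1 convolution; it only illustrates the grouping trick, ignores the width and grouping hyperparameters introduced in the paper, and is not the library's own packed layers.

import torch
from torch import nn

class TinyPackedLinear(nn.Module):
    """n_members independent linear maps packed into one grouped 1x1 convolution."""

    def __init__(self, in_features: int, out_features: int, n_members: int):
        super().__init__()
        # groups=n_members gives each ensemble member its own weight block.
        self.conv = nn.Conv1d(in_features * n_members, out_features * n_members,
                              kernel_size=1, groups=n_members)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_members * in_features) -> (batch, n_members * out_features)
        return self.conv(x.unsqueeze(-1)).squeeze(-1)

# All 4 members are computed in a single forward pass.
layer = TinyPackedLinear(in_features=16, out_features=8, n_members=4)
out = layer(torch.randn(32, 4 * 16))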

If you find the corresponding models interesting, please consider citing our paper:

@inproceedings{laurent2023packed,
    title={Packed-Ensembles for Efficient Uncertainty Estimation},
    author={Laurent, Olivier and Lafage, Adrien and Tartaglione, Enzo and Daniel, Geoffrey and Martinez, Jean-Marc and Bursuc, Andrei and Franchi, Gianni},
    booktitle={ICLR},
    year={2023}
}



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

torch_uncertainty-0.1.5.tar.gz (494.3 kB)

Uploaded Source

Built Distribution

torch_uncertainty-0.1.5-py3-none-any.whl (146.0 kB)

Uploaded Python 3

File details

Details for the file torch_uncertainty-0.1.5.tar.gz.

File metadata

  • Download URL: torch_uncertainty-0.1.5.tar.gz
  • Upload date:
  • Size: 494.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.7

File hashes

Hashes for torch_uncertainty-0.1.5.tar.gz:

  • SHA256: dc87bbce36c246da5fb0145e1df87b5c2f913964e75e8177c2a16cb3a02b6206
  • MD5: f408f650f73cc3102ee4797882f67bb4
  • BLAKE2b-256: 036ea542bcedd4092748077d29cd82521b3a11d5daadc76e4fc9648ddc70e932

See more details on using hashes here.

File details

Details for the file torch_uncertainty-0.1.5-py3-none-any.whl.


File hashes

Hashes for torch_uncertainty-0.1.5-py3-none-any.whl:

  • SHA256: bf57eea7c1330ff2d64885d7ac4c6bd842adca5449f75216895de0321e2b501d
  • MD5: af9ef9ddb3d9eb91da8abd027baee186
  • BLAKE2b-256: e3b3f1544792e0f6b54d07bd08980b17fc4c965daa225bd90dee51c3c0b1fce9

See more details on using hashes here.
