
Uncertainty quantification in PyTorch

Project description

TorchUncertainty is a package designed to help leverage uncertainty quantification techniques to make deep neural networks more reliable. It aims to be collaborative and to include as many methods as possible, so reach out to add yours!

:construction: TorchUncertainty is in early development :construction: - expect changes, but reach out and contribute if you are interested in the project! Please raise an issue if you encounter any bugs or difficulties, and join the Discord server.

:books: Our webpage and documentation are available here: torch-uncertainty.github.io. :books:

TorchUncertainty contains the official implementations of multiple papers from major machine-learning and computer vision conferences and was featured in tutorials at WACV 2024, HAICON 2024 and ECCV 2024.

The Torch-Uncertainty paper was published at the NeurIPS 2025 Datasets & Benchmarks track. Please consider citing it if the framework is helpful for your research.


This package provides a multi-level API, including:

  • easy-to-use :zap: Lightning uncertainty-aware training & evaluation routines for 4 tasks: classification, probabilistic and pointwise regression, and segmentation (see the sketch after this list).
  • fully automated evaluation of model performance with proper scores, selective classification, out-of-distribution detection, and distribution-shift metrics!
  • ready-to-train baselines on research datasets such as ImageNet and CIFAR.
  • layers, models, metrics, & losses available for your networks.
  • scikit-learn style post-processing methods such as Temperature Scaling.
  • transformations and augmentations, including corruptions that yield additional "corrupted datasets" available on Hugging Face.
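
To give a feel for the high-level API, here is a minimal training sketch. The class and argument names (TUTrainer, ClassificationRoutine, CIFAR10DataModule, model=, num_classes=, loss=) mirror the quickstart but should be treated as assumptions; the API reference is the authoritative source.

from torch import nn

from torch_uncertainty import TUTrainer
from torch_uncertainty.datamodules import CIFAR10DataModule
from torch_uncertainty.routines import ClassificationRoutine

# Any nn.Module classifier can be plugged in; ready-to-train baselines are also provided.
backbone = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 10),
)

dm = CIFAR10DataModule(root="./data", batch_size=128)
routine = ClassificationRoutine(model=backbone, num_classes=10, loss=nn.CrossEntropyLoss())

trainer = TUTrainer(max_epochs=1, accelerator="auto")
trainer.fit(routine, datamodule=dm)
trainer.test(routine, datamodule=dm)  # uncertainty-aware metrics are computed automatically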

Have a look at the Reference page or the API reference for a more exhaustive list of the implemented methods, datasets, metrics, etc.

:gear: Installation

TorchUncertainty requires Python 3.10 or greater. Install the desired PyTorch version in your environment. Then, install the package from PyPI:

pip install torch-uncertainty
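
For a quick sanity check that the package and your chosen PyTorch build live in the same environment, the following is enough (nothing here depends on TorchUncertainty's API):

import torch
import torch_uncertainty  # the import alone confirms the installation

print("PyTorch:", torch.__version__)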

The installation procedure for contributors is different: have a look at the contribution page.

:whale: Docker image for contributors

For contributors running experiments on cloud GPU instances, we provide a pre-built Docker image that includes all necessary dependencies and configurations, as well as the Dockerfile used to build your own custom images. This lets you launch an experiment-ready container with minimal setup. Please refer to DOCKER.md for further details.

:racehorse: Quickstart

We make a quickstart available at torch-uncertainty.github.io/quickstart.

:books: Implemented methods

TorchUncertainty currently supports classification, probabilistic and pointwise regression, segmentation and pixelwise regression (such as monocular depth estimation).

We also provide the following methods:

Uncertainty quantification models

To date, the following deep learning uncertainty quantification methods have been implemented. Click :inbox_tray: on a method for its tutorial:

Augmentation methods

The following data augmentation methods have been implemented:

  • Mixup, MixupIO, RegMixup, WarpingMixup (a plain-PyTorch Mixup sketch follows this list)
  • Modernized corruptions to evaluate model performance under distribution shift
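
For illustration, here is a minimal, self-contained Mixup in plain PyTorch (the classic formulation), showing what these augmentations do; this is not TorchUncertainty's implementation, which provides the variants listed above (see the API reference).

import torch

def mixup(x: torch.Tensor, y: torch.Tensor, alpha: float = 1.0):
    # Convexly combine a batch with a shuffled copy of itself.
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))
    x_mix = lam * x + (1.0 - lam) * x[perm]
    # The loss is mixed the same way:
    # loss = lam * ce(logits, y) + (1 - lam) * ce(logits, y[perm])
    return x_mix, y, y[perm], lam

images, labels = torch.randn(32, 3, 32, 32), torch.randint(0, 10, (32,))
mixed, y_a, y_b, lam = mixup(images, labels, alpha=0.2)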

Post-processing methods

To date, the following post-processing methods have been implemented:
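
As an example of the scikit-learn-style interface, here is a hedged sketch of the Temperature Scaling workflow mentioned above. The class name TemperatureScaler, its module path, and the fit/forward behavior are assumptions; check the API reference for the exact interface.

import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

from torch_uncertainty.post_processing import TemperatureScaler

model = nn.Linear(16, 10)  # stand-in for a trained classifier
calib_loader = DataLoader(
    TensorDataset(torch.randn(256, 16), torch.randint(0, 10, (256,))),
    batch_size=64,
)

scaler = TemperatureScaler(model=model)  # wraps the model's logits (assumed API)
scaler.fit(calib_loader)                 # learns a single temperature on held-out data
probs = scaler(torch.randn(8, 16)).softmax(-1)  # calibrated probabilities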

Official Implementations

TorchUncertainty includes the official code of the following papers:

  • Packed-Ensembles for Efficient Uncertainty Estimation - ICLR 2023 - Tutorial
  • LP-BNN: Encoding the latent posterior of Bayesian Neural Networks for uncertainty quantification - IEEE TPAMI 2023
  • MUAD: Multiple Uncertainties for Autonomous Driving, a benchmark for multiple uncertainty types and tasks - BMVC 2022

Tutorials

Check out all our tutorials at torch-uncertainty.github.io/auto_tutorials.

:telescope: Projects using TorchUncertainty

The following projects use TorchUncertainty:

  • Towards Understanding and Quantifying Uncertainty for Text-to-Image Generation - CVPR 2025
  • Towards Understanding Why Label Smoothing Degrades Selective Classification and How to Fix It - ICLR 2025
  • A Symmetry-Aware Exploration of Bayesian Neural Network Posteriors - ICLR 2024

If you are using TorchUncertainty in your project, please let us know, and we will add your project to this list!

Citation

If you use this software, please cite its corresponding paper:

@inproceedings{lafage2025torch_uncertainty,
    title={Torch-Uncertainty: A Deep Learning Framework for Uncertainty Quantification},
    author={Lafage, Adrien and Laurent, Olivier and Gabetni, Firas and Franchi, Gianni},
    booktitle={NeurIPS Datasets and Benchmarks Track},
    year={2025}
}

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

torch_uncertainty-0.10.1.tar.gz (203.6 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

torch_uncertainty-0.10.1-py3-none-any.whl (365.1 kB)

Uploaded Python 3

File details

Details for the file torch_uncertainty-0.10.1.tar.gz.

File metadata

  • Download URL: torch_uncertainty-0.10.1.tar.gz
  • Upload date:
  • Size: 203.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for torch_uncertainty-0.10.1.tar.gz:

  • SHA256: 3434f2ad22a02c9956ec3966f92564b82c0077f9b86bb519e7c4338832d06113
  • MD5: 4bf1eba2a928f41535722137c9d969dc
  • BLAKE2b-256: 1f66ae1ca5fc9f5f7be36fb4b1b7c5ee1b04661edf1bd21245e259882fd44ff5

See more details on using hashes here.
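
If you prefer to verify a download by hand, the standard library is enough; the local file path below is hypothetical, and the expected digest is the SHA256 value listed above.

import hashlib
from pathlib import Path

expected = "3434f2ad22a02c9956ec3966f92564b82c0077f9b86bb519e7c4338832d06113"
digest = hashlib.sha256(Path("torch_uncertainty-0.10.1.tar.gz").read_bytes()).hexdigest()
assert digest == expected, "hash mismatch: do not install this file"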

Provenance

The following attestation bundles were made for torch_uncertainty-0.10.1.tar.gz:

Publisher: pypi-publish.yml on torch-uncertainty/torch-uncertainty

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file torch_uncertainty-0.10.1-py3-none-any.whl.

File metadata

File hashes

Hashes for torch_uncertainty-0.10.1-py3-none-any.whl:

  • SHA256: 728c4355bb87efcfb20426f3a31d44ff3d2b45493f5cc4a8a836a6a850dd7743
  • MD5: 0c51eb7a03e4705a3bb8c41452887e92
  • BLAKE2b-256: 2980052e7f98234e8709be357af14789063b7bf8f81f468311e8c42eebe5c4b2

See more details on using hashes here.

Provenance

The following attestation bundles were made for torch_uncertainty-0.10.1-py3-none-any.whl:

Publisher: pypi-publish.yml on torch-uncertainty/torch-uncertainty

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
