Learn fast, scalable, and calibrated measures of uncertainty using neural networks!

Project description

Evidential Deep Learning

"All models are wrong, but some — that know when they can be trusted — are useful!"

- George Box (Adapted)

This repository contains the code to reproduce Deep Evidential Regression, published at NeurIPS 2020, as well as more general code for applying evidential learning to train neural networks that learn their own measures of uncertainty directly from data!

Setup

To use this package, you must install the following dependencies first:

  • python (>=3.7)
  • tensorflow (>=2.0)
  • pytorch (support coming soon)

Now you can install the package and start adding evidential layers and losses to your models!

pip install evidential-deep-learning

Now you're ready to start using this package directly as part of your existing tf.keras model pipelines (Sequential, Functional, or model-subclassing):

>>> import evidential_deep_learning as edl

Example

To use evidential deep learning, you must edit the last layer of your model to be evidential and use a supported loss function to train the system end-to-end. This repository supports evidential layers for both fully connected and 2D convolutional layers. The evidential prior presented in the paper follows a Normal-Inverse-Gamma distribution and can be added to your model:

import evidential_deep_learning as edl
import tensorflow as tf

model = tf.keras.Sequential(
    [
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(64, activation="relu"),
        edl.layers.DenseNormalGamma(1), # Evidential distribution!
    ]
)
model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-3), 
    loss=edl.losses.EvidentialRegression # Evidential loss!
)
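Under the hood, evidential regression fits the four Normal-Inverse-Gamma parameters by minimizing a negative log-likelihood plus a regularizer that penalizes evidence on errors. As a rough scalar illustration of the formulas given in the paper (not the package's actual implementation, which is vectorized in TensorFlow):

```python
import math

def nig_nll(y, mu, v, alpha, beta):
    """Negative log-likelihood of observation y under a Normal-Inverse-Gamma
    evidential distribution with parameters (mu, v, alpha, beta),
    following Amini et al. (2020)."""
    omega = 2.0 * beta * (1.0 + v)
    return (
        0.5 * math.log(math.pi / v)
        - alpha * math.log(omega)
        + (alpha + 0.5) * math.log(v * (y - mu) ** 2 + omega)
        + math.lgamma(alpha)
        - math.lgamma(alpha + 0.5)
    )

def evidence_regularizer(y, mu, v, alpha):
    """Regularizer from the paper: scales the prediction error |y - mu|
    by the total evidence (2v + alpha), discouraging confident mistakes."""
    return abs(y - mu) * (2.0 * v + alpha)

# A correct, low-evidence prediction incurs a small NLL and zero penalty.
loss = nig_nll(y=0.0, mu=0.0, v=1.0, alpha=2.0, beta=1.0)
penalty = evidence_regularizer(y=0.0, mu=0.0, v=1.0, alpha=2.0)
```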

Check out hello_world.py for an end-to-end toy example that walks through this step-by-step. For more complex examples that scale up to computer vision problems (where we learn to predict tens of thousands of evidential distributions simultaneously!), please refer to the NeurIPS 2020 paper and the Reproducibility section of this repo to run those examples.
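Once trained, a DenseNormalGamma head emits four parameters per target, in the paper's notation (mu, v, alpha, beta), from which both the prediction and its uncertainties can be read off. The decomposition below follows the formulas in the paper; the helper name itself is illustrative and not part of the package API:

```python
def nig_prediction(mu, v, alpha, beta):
    """Decompose Normal-Inverse-Gamma parameters into prediction and
    uncertainty, following Amini et al. (2020):
      prediction:            E[mu]      = mu
      aleatoric uncertainty: E[sigma^2] = beta / (alpha - 1)
      epistemic uncertainty: Var[mu]    = beta / (v * (alpha - 1))
    """
    aleatoric = beta / (alpha - 1.0)
    epistemic = beta / (v * (alpha - 1.0))
    return mu, aleatoric, epistemic

# E.g. (mu=0.5, v=2.0, alpha=3.0, beta=1.0) -> prediction 0.5,
# aleatoric 0.5, epistemic 0.25; more evidence v shrinks the
# epistemic (model) uncertainty while aleatoric (data) noise remains.
pred, aleatoric, epistemic = nig_prediction(0.5, 2.0, 3.0, 1.0)
```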

Reproducibility

All of the results published as part of our NeurIPS paper can be reproduced as part of this repository. Please refer to the reproducibility section for details and instructions to obtain each result.

Citation

If you use this code for evidential learning as part of your project or paper, please cite the following work:

@article{amini2020deep,
  title={Deep evidential regression},
  author={Amini, Alexander and Schwarting, Wilko and Soleimany, Ava and Rus, Daniela},
  journal={Advances in Neural Information Processing Systems},
  volume={33},
  year={2020}
}

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

evidential_deep_learning-0.4.0.tar.gz (5.1 kB view details)

Uploaded Source

File details

Details for the file evidential_deep_learning-0.4.0.tar.gz.

File metadata

  • Download URL: evidential_deep_learning-0.4.0.tar.gz
  • Upload date:
  • Size: 5.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.23.0 setuptools/46.4.0.post20200518 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.7.7

File hashes

Hashes for evidential_deep_learning-0.4.0.tar.gz:

  • SHA256: 517d4eee2e227ac31eafebb691801a3f2e9892212c721c9f31c50592dcecc4ee
  • MD5: f80b3a00d62d0bb14ba7a7a2e357223f
  • BLAKE2b-256: e15f86f5fb9aca35ed7e3d824f18ccc3dc4650f53e1db80428b4505a546f5c67

