Uncertainty quantification with PyTorch

Installation | Quickstart | Methods | Friends | Contributing | Citation | Documentation | Paper

What is posteriors?

A general-purpose Python library for uncertainty quantification with PyTorch.

  • Composable: Use with transformers, lightning, torchopt, torch.distributions, pyro and more!
  • Extensible: Add new methods! Add new models!
  • Functional: Easier to test, closer to mathematics!
  • Scalable: Big model? Big data? No problem!
  • Swappable: Swap between algorithms with ease!

Installation

posteriors is available on PyPI and can be installed via pip:

pip install posteriors

Quickstart

posteriors is functional-first and aims to be easy to use and extend. Let's try it out by training a simple model with variational inference:

from torchvision.datasets import MNIST
from torchvision.transforms import ToTensor
from torch import nn, utils, func
import torchopt
import posteriors

dataset = MNIST(root="./data", transform=ToTensor(), download=True)
train_loader = utils.data.DataLoader(dataset, batch_size=32, shuffle=True)
num_data = len(dataset)

classifier = nn.Sequential(nn.Linear(28 * 28, 64), nn.ReLU(), nn.Linear(64, 10))
params = dict(classifier.named_parameters())


def log_posterior(params, batch):
    images, labels = batch
    images = images.view(images.size(0), -1)
    output = func.functional_call(classifier, params, images)
    log_post_val = (
        -nn.functional.cross_entropy(output, labels)
        + posteriors.diag_normal_log_prob(params) / num_data
    )
    return log_post_val, output


transform = posteriors.vi.diag.build(
    log_posterior, torchopt.adam(), temperature=1 / num_data
)  # Can swap out for any posteriors algorithm

state = transform.init(params)

for batch in train_loader:
    state, aux = transform.update(state, batch)

Observe that posteriors recommends specifying log_posterior and temperature such that log_posterior remains on the same scale for different batch sizes. posteriors algorithms are designed to be stable as temperature goes to zero.
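As a sketch of why this scaling works (notation ours, not from the original): with $N$ data points and a batch of size $B$, the mean-reduced cross-entropy plus the prior term divided by num_data gives

```latex
\log\mathrm{posterior}(\theta, \mathrm{batch})
  = \frac{1}{B}\sum_{i \in \mathrm{batch}} \log p(y_i \mid x_i, \theta)
    + \frac{1}{N}\log p(\theta)
  \;\approx\; \frac{1}{N}\log p(\theta, \mathcal{D}),
```

which is independent of the batch size $B$. With temperature $T = 1/N$, a target proportional to $\exp(\log\mathrm{posterior} / T)$ then recovers the untempered posterior $p(\theta \mid \mathcal{D})$, while smaller temperatures concentrate the target around its mode.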

Further, log_posterior returns a tuple: the evaluation itself (a single-element Tensor) and an auxiliary output (a TensorTree) holding any information we'd like to retain from the model call, here the model predictions. If you have no auxiliary information, simply return torch.tensor([]) as the second element. For more info, see torch.func.grad (with has_aux=True) or the documentation.
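To make the (value, aux) convention concrete, here is a small stdlib-only sketch. The grad_with_aux helper is hypothetical, a finite-difference stand-in for torch.func.grad(..., has_aux=True): it differentiates only the first element of the returned tuple and passes the second element through untouched.

```python
def grad_with_aux(f, eps=1e-6):
    """Mimic torch.func.grad(f, has_aux=True) for a scalar-valued function
    of a list of floats, via central finite differences. Returns a function
    mapping params -> (grads, aux), where f(params) -> (value, aux)."""
    def grad_f(params):
        _, aux = f(params)  # aux is returned as-is, never differentiated
        grads = []
        for i in range(len(params)):
            up = list(params)
            up[i] += eps
            down = list(params)
            down[i] -= eps
            grads.append((f(up)[0] - f(down)[0]) / (2 * eps))
        return grads, aux
    return grad_f


def loss_with_predictions(params):
    # Toy "model call": value is a scalar loss, aux holds the predictions.
    preds = [p * 2 for p in params]
    loss = sum(p ** 2 for p in preds)
    return loss, preds


grads, aux = grad_with_aux(loss_with_predictions)([1.0, 3.0])
# loss = sum((2p)^2) so d(loss)/dp = 8p: grads ≈ [8.0, 24.0], aux == [2.0, 6.0]
```

The same shape applies in posteriors: the gradient drives the update, while aux (here, predictions) is surfaced to the caller for logging or metrics.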

Check out the tutorials for more detailed usage!

Methods

posteriors supports a variety of methods for uncertainty quantification, with full details available in the API documentation.

posteriors is designed to be easily extensible. If your favorite method is not listed above, raise an issue and we'll see what we can do!

Friends

posteriors interfaces seamlessly with the wider PyTorch ecosystem, including transformers, lightning, torchopt, torch.distributions and pyro, as well as other UQ libraries: fortuna, laplace, numpyro, pymc and uncertainty-baselines.

The functional transform interface is strongly inspired by frameworks such as optax and blackjax.

Contributing

You can report a bug or request a feature by creating a new issue on GitHub.

If you want to contribute code, please check the contributing guide.

Citation

If you use posteriors in your research, please cite the library using the following BibTeX entry:

@article{duffield2024scalable,
  title={Scalable Bayesian Learning with posteriors},
  author={Duffield, Samuel and Donatella, Kaelan and Chiu, Johnathan and Klett, Phoebe and Simpson, Daniel},
  journal={arXiv preprint arXiv:2406.00104},
  year={2024}
}
