viabel: Variational Inference Approximation Bounds that are Efficient and Lightweight
Description
This package computes bounds on the errors of the mean, standard deviation, and variance estimates produced by a continuous approximation to an (unnormalized) distribution. A canonical application is a variational approximation to a Bayesian posterior distribution. In particular, using samples from the approximation Q and evaluations of the (possibly unnormalized) log densities of Q and the target distribution P, the package provides functionality to compute bounds on:
- the α-divergence between P and Q
- the p-Wasserstein distance between P and Q
- the differences between the means, standard deviations, and variances of P and Q
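As a minimal sketch of the inputs these bounds require (samples from Q plus log-density evaluations — the names and setup below are illustrative, not viabel's API), the following estimates the ELBO lower bound and the CUBO_2 upper bound on the log normalizing constant for a toy Gaussian pair; the gap between such quantities is what drives divergence-based error bounds:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target P = N(0, 1) and approximation Q = N(0.1, 0.9^2).
def log_p(z):  # target log density (normalized here, so log Z = 0)
    return -0.5 * z**2 - 0.5 * np.log(2 * np.pi)

mu_q, sigma_q = 0.1, 0.9
def log_q(z):
    return (-0.5 * ((z - mu_q) / sigma_q) ** 2
            - np.log(sigma_q) - 0.5 * np.log(2 * np.pi))

# The bounds only need samples from Q and log-density evaluations:
z = rng.normal(mu_q, sigma_q, size=100_000)
log_w = log_p(z) - log_q(z)        # log importance weights

# ELBO: Monte Carlo lower bound on log Z.
elbo = log_w.mean()

# CUBO_2 = 0.5 * log E_Q[(p/q)^2]: upper bound on log Z,
# computed with a numerically stable log-sum-exp.
m = (2 * log_w).max()
cubo2 = 0.5 * (m + np.log(np.exp(2 * log_w - m).mean()))

print(f"ELBO = {elbo:.4f}, CUBO_2 = {cubo2:.4f}")
```

Since P is normalized in this toy example, log Z = 0, so the ELBO comes out slightly negative and CUBO_2 slightly positive; their gap controls the 2-divergence between P and Q.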
There is also optional variational Bayes functionality (viabel.vb), which supports both standard KL-based variational inference (KLVI) and chi-squared variational inference (CHIVI).
Models are provided as autograd-compatible log densities or can be constructed from pystan fit objects.
The variational objective is optimized using a windowed version of adagrad and unbiased reparameterization gradients.
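An unbiased reparameterization gradient can be sketched for a one-dimensional Gaussian family (a toy illustration, not viabel's implementation; the target density and step sizes here are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)

def elbo_grad(mu, log_sigma, n_samples=1000):
    """Unbiased reparameterization-gradient estimate of the ELBO gradient
    for q(z) = N(mu, sigma^2) against the toy target log p(z) = -(z-2)^2/2.

    Draw eps ~ N(0, 1), set z = mu + sigma * eps, and differentiate
    through z. For a Gaussian q the entropy is log(sigma) + const,
    so its gradient w.r.t. log_sigma is simply 1.
    """
    sigma = np.exp(log_sigma)
    eps = rng.normal(size=n_samples)
    z = mu + sigma * eps
    dlogp_dz = -(z - 2.0)                      # gradient of log p at z
    d_mu = dlogp_dz.mean()                     # dz/dmu = 1
    d_log_sigma = (dlogp_dz * eps * sigma).mean() + 1.0  # dz/dlog_sigma = sigma*eps
    return d_mu, d_log_sigma

# A few plain SGD ascent steps drive q toward the target N(2, 1).
mu, log_sigma = 0.0, 0.0
for _ in range(500):
    g_mu, g_ls = elbo_grad(mu, log_sigma)
    mu += 0.05 * g_mu
    log_sigma += 0.05 * g_ls
print(mu, np.exp(log_sigma))
```

Because the noise eps is independent of the variational parameters, the sample average above is an unbiased estimate of the exact ELBO gradient, which is what makes stochastic optimization of the objective work.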
By default there is support for mean-field Gaussian, mean-field Student's t,
and full-rank Student's t variational families.
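The windowed-adagrad idea mentioned above can be sketched as follows (a toy scalar version; viabel's actual update rule and hyperparameters may differ):

```python
from collections import deque
import math

def windowed_adagrad_step(x, grad, history, window=10, lr=0.5, eps=1e-8):
    # Keep only the last `window` squared gradients, so the step size
    # can recover after transient large gradients (plain AdaGrad instead
    # accumulates squared gradients over the entire run).
    history.append(grad * grad)
    if len(history) > window:
        history.popleft()
    scale = math.sqrt(sum(history)) + eps
    return x - lr * grad / scale

# Toy usage: minimize f(x) = (x - 3)^2.
x, hist = 0.0, deque()
for _ in range(200):
    grad = 2.0 * (x - 3.0)
    x = windowed_adagrad_step(x, grad, hist)
print(x)  # settles near the minimizer 3.0
```

With a fixed window the denominator tracks only recent gradient magnitudes, behaving more like RMSProp than classic AdaGrad, whose denominator grows monotonically.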
If you use this package, please cite:
Practical posterior error bounds from variational objectives. Jonathan H. Huggins, Mikołaj Kasprzak, Trevor Campbell, Tamara Broderick. In Proc. of the 23rd International Conference on Artificial Intelligence and Statistics (AISTATS), Palermo, Italy. PMLR: Volume 108, 2020.
Compilation and testing
After cloning the repository, installation and testing are easy. If you just want to compute bounds, you can install using the command
pip install .
The only dependency is numpy. If you want to use the basic variational Bayes functionality, use the command
pip install .[vb]
This will install some additional dependencies. If in addition to the above, you want to run all of the example notebooks, use the command
pip install .[examples]
This will install even more dependencies.
To test the package:
nosetests tests/
Currently there is only coverage for viabel.bounds.
Usage Examples
The normal mixture notebook provides basic usage examples of the bounds.
The robust regression example demonstrates how to use the variational Bayes functionality and then compute bounds.
Running Comparison Experiments
The file notebooks/experiments.py contains additional functionality for running experiments and computing PSIS-corrected posterior estimates.
The robust regression example uses some of this functionality.
A simple funnel distribution example demonstrates how to use the high-level run_experiment function.
The eight schools example is more involved and realistic.