Efficient, lightweight variational inference and approximation bounds
viabel: Variational Inference Approximation Bounds that are Efficient and Lightweight
This package computes bounds on the errors of the mean, standard deviation, and variance estimates produced by a continuous approximation to an (unnormalized) distribution. A canonical application is a variational approximation to a Bayesian posterior distribution. In particular, using samples from the approximation Q and evaluations of the (possibly unnormalized) log densities of Q and the target distribution P, the package provides functionality to compute bounds on:
- the α-divergence between P and Q
- the p-Wasserstein distance between P and Q
- the differences between the means, standard deviations, and variances of P and Q
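To make the required inputs concrete, the following self-contained sketch (plain NumPy, not viabel's API; the Gaussian P and Q are illustrative stand-ins chosen so the answer is known in closed form) uses samples from Q together with the two log densities to estimate log E_Q[(p/q)²], the Monte Carlo quantity underlying a Rényi 2-divergence bound:

```python
import numpy as np

# Illustrative sketch (not viabel's API): estimate log E_Q[(p/q)^2] from
# samples of Q and evaluations of log p and log q. Here P = N(0, 1) and
# Q = N(0, 1.5^2) are stand-in distributions with a known exact answer.
rng = np.random.default_rng(0)
z = rng.normal(0.0, 1.5, size=100_000)                   # samples from Q

log_p = -0.5 * z**2 - 0.5 * np.log(2 * np.pi)            # log density of P
log_q = -0.5 * (z / 1.5)**2 - 0.5 * np.log(2 * np.pi) - np.log(1.5)

log_w = log_p - log_q                                    # log importance weights
m = log_w.max()                                          # log-sum-exp for stability
log_e_w2 = 2 * m + np.log(np.mean(np.exp(2 * (log_w - m))))

print(log_e_w2)  # ~0.18; the exact value here is log(4.5 / sqrt(14)) ≈ 0.1845
```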
There is also (optional) variational Bayes functionality, which supports both standard KL-based variational inference (KLVI) and chi-squared variational inference (CHIVI). Models are provided as autograd-compatible log densities or can be constructed from PyStan fit objects.
The variational objective is optimized using a windowed version of Adagrad and unbiased reparameterization gradients. By default there is support for mean-field Gaussian, mean-field Student's t, and full-rank Student's t variational families.
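The optimization loop above can be sketched in a few lines. This is an assumed toy setup, not viabel's implementation: a mean-field Gaussian family, reparameterization gradients of the ELBO, and plain Adagrad (rather than the windowed variant) on an illustrative N(2, 1) target:

```python
import numpy as np

# Minimal KLVI sketch (assumed toy setup, not viabel's API): mean-field
# Gaussian family, reparameterization gradients, Adagrad-style steps.
# The target N(2, 1) is an illustrative choice with a known optimum.
rng = np.random.default_rng(1)

def grad_log_p(z):
    return -(z - 2.0)            # gradient of log N(z; 2, 1)

mu, omega = 0.0, 0.0             # variational mean and log-std
g2_mu = g2_om = 1e-8             # Adagrad squared-gradient accumulators
lr = 0.1
for _ in range(2000):
    eps = rng.normal(size=50)
    z = mu + np.exp(omega) * eps                      # reparameterization trick
    g = grad_log_p(z)
    grad_mu = g.mean()                                # ELBO gradient w.r.t. mu
    grad_om = (g * np.exp(omega) * eps).mean() + 1.0  # + entropy gradient
    g2_mu += grad_mu**2
    g2_om += grad_om**2
    mu += lr * grad_mu / np.sqrt(g2_mu)               # stochastic ascent on ELBO
    omega += lr * grad_om / np.sqrt(g2_om)

sigma = float(np.exp(omega))
print(mu, sigma)                 # should approach the target's mean 2 and std 1
```

Because the target is itself Gaussian, the optimal variational parameters are exactly the target's mean and standard deviation, which makes the sketch easy to sanity-check.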
If you use this package, please cite:
Practical posterior error bounds from variational objectives. Jonathan H. Huggins, Mikołaj Kasprzak, Trevor Campbell, Tamara Broderick. In Proc. of the 23rd International Conference on Artificial Intelligence and Statistics (AISTATS), Palermo, Italy. PMLR: Volume 108, 2020.
Compilation and testing
After cloning the repository, testing and installation are easy. If you just want to compute bounds, you can install using the command
pip install .
The only dependency is numpy. If you want to use the basic variational Bayes functionality, use the command
functionality, use the command
pip install .[vb]
This will install some additional dependencies. If in addition to the above, you want to run all of the example notebooks, use the command
pip install .[examples]
This will install even more dependencies.
To test the package, run the included test suite. (Test coverage is currently limited.)
The normal mixture notebook provides basic usage examples of the bounds.
The robust regression example demonstrates how to use the variational Bayes functionality and then compute bounds.
Running Comparison Experiments
The notebooks/experiments.py file contains additional functionality for running experiments and computing PSIS-corrected posterior estimates.
The robust regression example uses some of this functionality.
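The idea behind such corrections can be illustrated without the experiment code. The sketch below (a hedged stand-in, not experiments.py itself) corrects a posterior mean estimate with plain self-normalized importance weights; PSIS additionally stabilizes the largest weights via a generalized Pareto fit, which is omitted here for brevity:

```python
import numpy as np

# Hedged illustration (not the code in experiments.py): correcting a posterior
# mean estimate with self-normalized importance weights. Q = N(0, 1) is the
# approximation; the illustrative target is a shifted P = N(0.5, 1).
rng = np.random.default_rng(2)
z = rng.normal(0.0, 1.0, size=50_000)                  # draws from Q

log_w = -0.5 * (z - 0.5)**2 + 0.5 * z**2               # log p - log q (up to a constant)
w = np.exp(log_w - log_w.max())                        # stabilized weights
corrected_mean = float(np.sum(w * z) / np.sum(w))      # self-normalized IS estimate

print(z.mean(), corrected_mean)  # naive mean under Q ~0.0 vs corrected ~0.5
```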
A simple funnel distribution example demonstrates how to use the high-level
The eight schools example is more involved and realistic.