
Stacking Variational Bayesian Monte Carlo (S-VBMC)

Overview

Stacking Variational Bayesian Monte Carlo (S-VBMC) [1] is a fast post-processing step for Variational Bayesian Monte Carlo (VBMC). VBMC is an approximate Bayesian inference technique that produces a variational posterior in the form of a Gaussian mixture (see the relevant papers [2-4] for more details). S-VBMC improves upon this by combining ("stacking") the Gaussian mixture components from several independent VBMC runs into a single, larger mixture, which we call the "stacked posterior". It then re-optimizes the weights of this combined mixture to maximize its Evidence Lower BOund (ELBO, a lower bound on the log model evidence).
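
In schematic notation (ours, not the paper's; see [1] for the exact objective and its estimators): if run r of R returns the Gaussian mixture

    q_r(θ) = Σ_{k=1}^{K_r} w_{r,k} N(θ; μ_{r,k}, Σ_{r,k}),

the stacked posterior pools every component into the larger mixture

    q_stacked(θ) = Σ_{r=1}^{R} Σ_{k=1}^{K_r} ω_{r,k} N(θ; μ_{r,k}, Σ_{r,k}),

and only the new weights ω_{r,k} (with ω_{r,k} ≥ 0 and Σ_{r,k} ω_{r,k} = 1) are re-optimized to maximize the ELBO of the stacked mixture. The component means and covariances stay fixed, which is why no new evaluations of the target model are needed.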

A key advantage of S-VBMC is its efficiency: the original model is never re-evaluated, making it an inexpensive way to boost inference performance. Furthermore, no communication is needed among VBMC runs, making it possible to run them in parallel before applying S-VBMC as a post-processing step with negligible computational overhead.

Refer to the S-VBMC paper for further details [1].

When to use S-VBMC

S-VBMC works as a post-processing step for VBMC, so it shares its use cases (described here).

Performing several VBMC inference runs with different initialization points is already recommended by the developers for robustness and convergence diagnostics; therefore, S-VBMC naturally fits into VBMC's best practices. Because S-VBMC is inexpensive and effective, we recommend using it whenever you first perform inference with VBMC. It is especially useful when separate VBMC runs yield noticeably different variational posteriors, which might happen when the target distribution has a particularly complex shape (see this notebook for two examples of this).


How to use S-VBMC

1. Installation

Create a new environment in conda and activate it:

conda create -n svbmc python=3.11
conda activate svbmc

Install svbmc:

  1. Clone the repo:
     git clone https://github.com/acerbilab/svbmc.git
  2. Install from the cloned repository folder:
     pip install -e .
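
To verify that the installation succeeded, you can check that the package imports cleanly (an optional sanity check, not part of the original instructions):

python -c "import svbmc"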

2. Running S-VBMC

You should have already run VBMC multiple times on the same problem and saved the resulting VariationalPosterior objects as .pkl files. Refer to these notebooks for VBMC usage examples.
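
If you still need to generate these runs, the sketch below shows one minimal way to do so with PyVBMC. It assumes PyVBMC's standard interface (VBMC(...).optimize() returning the variational posterior); the target log_joint, the folder vbmc_runs/, and all bounds here are illustrative placeholders:

import os
import pickle
import numpy as np
from pyvbmc import VBMC

# Placeholder target: swap in your own unnormalized log-density.
def log_joint(theta):
    return -0.5 * np.sum(theta**2)

D = 2  # number of model parameters
lb, ub = np.full((1, D), -10.0), np.full((1, D), 10.0)  # hard bounds
plb, pub = np.full((1, D), -3.0), np.full((1, D), 3.0)  # plausible bounds

os.makedirs("vbmc_runs", exist_ok=True)
for i in range(4):  # independent runs from different initialization points
    x0 = np.random.uniform(plb, pub)
    vbmc = VBMC(log_joint, x0, lb, ub, plb, pub)
    vp, results = vbmc.optimize()
    with open(f"vbmc_runs/run_{i}.pkl", "wb") as f:
        pickle.dump(vp, f)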

First, load these objects into a single list. For example, if you have your files in a folder named vbmc_runs/:

import pickle
import glob

vp_files = glob.glob("vbmc_runs/*.pkl")
vp_list = []
for file in vp_files:
    with open(file, "rb") as f:
        vp_list.append(pickle.load(f))

Next, initialize the SVBMC object with this list and run the optimization.

from svbmc.svbmc import SVBMC

# Initialize the SVBMC object and optimize the weights
vp_stacked = SVBMC(vp_list=vp_list)
vp_stacked.optimize()

# The SVBMC object now contains the optimized weights and ELBO estimates
print(f"Stacked ELBO: {vp_stacked.elbo['estimated']}")

For a detailed walkthrough, see this notebook, which optionally includes a minimal guide on how to run VBMC multiple times. Additionally, this notebook addresses scenarios where the target log-density evaluations are noisy.

Note: For compatibility with VBMC, this implementation of S-VBMC stores results in NumPy arrays. However, it uses PyTorch under the hood to run the ELBO optimization.


⚠️ Important: how to use the final posterior

For any downstream application, use samples drawn from the stacked posterior; do not interpret the sufficient statistics (means and covariance matrices) of its individual components.

This is because each VBMC run may use different internal parameter transformations. Consequently, the component means and covariance matrices from different VBMC posteriors exist in incompatible parameter spaces. Combining them creates a mixture whose individual Gaussian components are not directly meaningful.

Always use samples from the final stacked posterior, which are correctly transformed back into the original parameter space. These are available via the .sample() method:

# Draw 10,000 samples from the final, stacked posterior
samples = vp_stacked.sample(n_samples=10000)
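
Any posterior summaries should then be computed from these samples; for example (assuming, as in the snippet above, that sample() returns an (N, D) NumPy array):

import numpy as np

post_mean = np.mean(samples, axis=0)      # posterior mean of each parameter
post_cov = np.cov(samples, rowvar=False)  # posterior covariance matrix
print(f"Posterior mean: {post_mean}")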

References and citation

  1. Silvestrin, F., Li, C., & Acerbi, L. (2025). Stacking Variational Bayesian Monte Carlo. arXiv preprint arXiv:2504.05004. (paper on arXiv)
  2. Acerbi, L. (2018). Variational Bayesian Monte Carlo. In Advances in Neural Information Processing Systems 31: 8222-8232. (paper + supplement on arXiv, NeurIPS Proceedings)
  3. Acerbi, L. (2020). Variational Bayesian Monte Carlo with Noisy Likelihoods. In Advances in Neural Information Processing Systems 33: 8211-8222. (paper + supplement on arXiv, NeurIPS Proceedings)
  4. Huggins, B., Li, C., Tobaben, M., Aarnos, M., & Acerbi, L. (2023). PyVBMC: Efficient Bayesian inference in Python. Journal of Open Source Software 8(86), 5428, https://doi.org/10.21105/joss.05428.

Please cite all four references if you use S-VBMC in your work.

Additional references

  1. Acerbi, L. (2019). An Exploration of Acquisition and Mean Functions in Variational Bayesian Monte Carlo. In Proc. Machine Learning Research 96: 1-10. 1st Symposium on Advances in Approximate Bayesian Inference, Montréal, Canada. (paper in PMLR)

BibTeX

@article{silvestrin2025stacking,
  title={{S}tacking {V}ariational {B}ayesian {M}onte {C}arlo},
  author={Silvestrin, Francesco and Li, Chengkun and Acerbi, Luigi},
  journal={arXiv preprint arXiv:2504.05004},
  year={2025}
}

@article{acerbi2018variational,
  title={{V}ariational {B}ayesian {M}onte {C}arlo},
  author={Acerbi, Luigi},
  journal={Advances in Neural Information Processing Systems},
  volume={31},
  pages={8222--8232},
  year={2018}
}

@article{acerbi2020variational,
  title={{V}ariational {B}ayesian {M}onte {C}arlo with noisy likelihoods},
  author={Acerbi, Luigi},
  journal={Advances in Neural Information Processing Systems},
  volume={33},
  pages={8211--8222},
  year={2020}
}

@article{huggins2023pyvbmc,
  title={{PyVBMC}: Efficient {B}ayesian inference in {P}ython},
  author={Huggins, Bobby and Li, Chengkun and Tobaben, Marlon and Aarnos, Mikko J. and Acerbi, Luigi},
  publisher={The Open Journal},
  journal={Journal of Open Source Software},
  url={https://doi.org/10.21105/joss.05428},
  doi={10.21105/joss.05428},
  year={2023},
  volume={8},
  number={86},
  pages={5428}
}

@article{acerbi2019exploration,
  title={An Exploration of Acquisition and Mean Functions in {V}ariational {B}ayesian {M}onte {C}arlo},
  author={Acerbi, Luigi},
  journal={Proceedings of Machine Learning Research},
  volume={96},
  pages={1--10},
  year={2019}
}

License

S-VBMC is released under the terms of the BSD 3-Clause License.

Acknowledgments

S-VBMC was developed by members of the Machine and Human Intelligence Lab at the University of Helsinki.
