
SOBER

Fast Bayesian optimization, quadrature, and inference over arbitrary domains (continuous, discrete, and mixed spaces) with GPU-parallel acceleration, built on GPyTorch and BoTorch. The paper is available on arXiv.

Animation

While the existing method (batch Thompson sampling; TS) gets stuck in local minima, SOBER robustly finds the global optimum.
SOBER provides a faster, more sample-efficient, more diversified, and more scalable optimization scheme than existing methods.
In the paper, SOBER outperformed 11 competitive baselines on 12 synthetic and diverse real-world tasks.

  • Red star: ground truth
  • Black crosses: next batch queries recommended by SOBER
  • White dots: historical observations
  • Branin function: black-box function to maximise
  • $\pi$: the probability of global optimum locations estimated by SOBER
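The Branin function used in the animation is a standard two-dimensional benchmark. As a reference, here is a plain-Python implementation (negated at the end, since the text frames the task as maximisation; this is background, not SOBER code):

```python
import math

def branin(x1, x2):
    """Standard Branin function; its global minimum value is
    ~0.397887, attained at (-pi, 12.275), (pi, 2.275), and (9.42478, 2.475)."""
    a = 1.0
    b = 5.1 / (4 * math.pi ** 2)
    c = 5.0 / math.pi
    r = 6.0
    s = 10.0
    t = 1.0 / (8 * math.pi)
    return a * (x2 - b * x1 ** 2 + c * x1 - r) ** 2 + s * (1 - t) * math.cos(x1) + s

def neg_branin(x1, x2):
    """Negated Branin, so the three global minima become maxima."""
    return -branin(x1, x2)
```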

Features

  • Fast batch Bayesian optimization
  • Fast batch Bayesian quadrature
  • Fast Bayesian inference
  • Fast fully Bayesian Gaussian process modelling and related acquisition functions
  • Sample-efficient simulation-based inference
  • Massively parallel active learning
  • GPU acceleration
  • Arbitrary domain space (continuous, discrete, mixed, or a dataset as the domain)
  • Arbitrary kernel for surrogate modelling
  • Arbitrary acquisition function
  • Arbitrary prior distribution for Bayesian inference
  • Expectation propagation for feature-by-feature Bayesian optimization
  • Inverse modelling for training Gaussian processes as fast optimization surrogates

Tutorials for practitioners/researchers

To get started with SOBER with minimal friction, use the guided interface, which walks you through the customization options.
We have also prepared detailed explanations of how to customize SOBER for your tasks; these use SOBER directly for a deeper understanding.
See tutorials.

  • 00 Quick start
  • 01 How does SOBER work?
  • 02 Customise prior for various domain types
  • 03 Customise acquisition function
  • 04 Fast fully Bayesian Gaussian process modelling
  • 05 Fast Bayesian inference for simulation-based inference
  • 06 Tips for drug discovery
  • 07 Compare with Thompson sampling
  • 08 Benchmarking against batch BO methods
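Tutorial 07 contrasts SOBER with batch Thompson sampling (TS). As background, here is a minimal NumPy sketch of naive batch TS over a 1-D discrete candidate set (illustrative only, not SOBER's or BoTorch's implementation): each posterior sample's argmax becomes one batch query, which is why naive TS batches can collapse onto a single local mode.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(a, b, ls=0.3):
    # Squared-exponential kernel matrix between 1-D point sets a and b.
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / ls ** 2)

def gp_posterior(x_train, y_train, x_cand, noise=1e-4):
    # Exact GP posterior mean and covariance evaluated on the candidate set.
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_cand)
    Kss = rbf(x_cand, x_cand)
    sol = np.linalg.solve(K, Ks)
    return sol.T @ y_train, Kss - Ks.T @ sol

def batch_thompson(x_train, y_train, x_cand, batch_size):
    # One joint posterior sample per batch slot; each sample's argmax
    # becomes one query (duplicates are possible in this naive version).
    mean, cov = gp_posterior(x_train, y_train, x_cand)
    cov = cov + 1e-8 * np.eye(len(x_cand))  # jitter for sampling stability
    samples = rng.multivariate_normal(mean, cov, size=batch_size)
    return x_cand[np.argmax(samples, axis=1)]

x_train = np.array([0.1, 0.5, 0.9])
y_train = np.sin(6 * x_train)
x_cand = np.linspace(0, 1, 101)
batch = batch_thompson(x_train, y_train, x_cand, batch_size=5)
```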

Examples

See examples for reproducing the results in the paper.

Installation

Please download the newest .whl file from Releases. If you wish to build from source, git clone the repository and run the following commands in the top folder:

pip install build
python -m build

You will find the packaged library in the dist folder.

Brief explanation


We solve batch global optimization as Bayesian quadrature: we select the batch query locations to minimize the integration error of the true function $f_\text{true}$ over the probability measure $\pi$. Here $\pi$ is the probability of global optimum locations estimated by SOBER; it becomes more confident (shrinks toward the true global optima) over iterations.
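SOBER's actual batch selection builds on kernel recombination (see BASQ and RCHQ in the acknowledgements). As a simplified illustration of the underlying idea, picking a batch whose points integrate well against $\pi$, here is plain kernel herding over a weighted candidate set; this is a stand-in for, not a reproduction of, SOBER's algorithm.

```python
import numpy as np

def rbf(a, b, ls=0.5):
    # RBF kernel matrix for 1-D point sets.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def kernel_herding(x_cand, pi, batch_size, ls=0.5):
    """Greedy kernel herding: pick points whose empirical kernel mean
    tracks the pi-weighted kernel mean embedding, a simple proxy for
    minimising worst-case integration error in the RKHS."""
    K = rbf(x_cand, x_cand, ls)
    mu = K @ pi  # kernel mean embedding of pi, evaluated at each candidate
    chosen = []
    for t in range(batch_size):
        # Standard herding score: mu(x) - (1/(t+1)) * sum_{s<=t} k(x, x_s)
        penalty = K[:, chosen].mean(axis=1) if chosen else 0.0
        score = mu - (t / (t + 1)) * penalty
        chosen.append(int(np.argmax(score)))
    return x_cand[chosen]

x_cand = np.linspace(-3, 3, 201)
pi = np.exp(-0.5 * x_cand ** 2)  # unnormalised Gaussian stand-in for pi
pi = pi / pi.sum()
batch = kernel_herding(x_cand, pi, batch_size=4)
```

The first selected point lands near the mode of $\pi$; later picks are pushed away from already-chosen points, which mirrors the diversity the text claims for SOBER's batches.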

Requirements

  • PyTorch
  • GPyTorch
  • BoTorch

Acknowledgement

This code repository uses materials from the following public and author-provided code. The authors thank the respective repository maintainers.

  • BASQ: Adachi, M., Hayakawa, S., Jørgensen, M., Oberhauser, H., Osborne, M. A., Fast Bayesian Inference with Batch Bayesian Quadrature via Kernel Recombination. Advances in Neural Information Processing Systems, 35 (NeurIPS 2022) code, paper
  • RCHQ: Hayakawa, S., Oberhauser, H., Lyons, T., Positively Weighted Kernel Quadrature via Subsampling. Advances in Neural Information Processing Systems, 35 (NeurIPS 2022) code, paper
  • Thompson sampling: Kandasamy, K., Krishnamurthy, A., Schneider, J. and Póczos, B., Parallelised Bayesian optimisation via Thompson sampling. International Conference on Artificial Intelligence and Statistics (AISTATS 2018) code from BoTorch, paper
  • Decoupled Thompson sampling: Wilson, J., Borovitskiy, V., Terenin, A., Mostowsky, P. and Deisenroth, M., Efficiently sampling functions from Gaussian process posteriors. International Conference on Machine Learning (ICML 2020) code from @saitcakmak, paper
  • Determinantal point process Thompson sampling (DPP-TS): Nava, E., Mutny, M. and Krause, A., Diversified sampling for batched Bayesian optimization with determinantal point processes. International Conference on Artificial Intelligence and Statistics (AISTATS 2022). We thank the authors for providing the code and allowing us to open-source it here: @elvisnava, paper
  • GIBBON: Moss, H.B., Leslie, D.S., Gonzalez, J. and Rayson, P., GIBBON: General-purpose information-based Bayesian optimisation. Journal of Machine Learning Research, 22(1) (JMLR 2021) code from BoTorch, paper
  • TuRBO: Eriksson, D., Pearce, M., Gardner, J. R., Turner, R. and Poloczek, M., Scalable global optimization via local Bayesian optimization. Advances in Neural Information Processing Systems, 32 (NeurIPS 2019) code from BoTorch, paper

Cite as

Please cite this work as

@article{adachi2024a,
  title={A Quadrature Approach for General-Purpose Batch Bayesian Optimization via Probabilistic Lifting},
  author={Adachi, Masaki and Hayakawa, Satoshi and J{\o}rgensen, Martin and Hamid, Saad and Oberhauser, Harald and Osborne, Michael A},
  journal={arXiv preprint arXiv:2404.12219},
  year={2024}
}
