Bayesian optimization interface for the laplace-torch library
Project description
Bayesian Optimization Interface for laplace-torch
Installation
Install PyTorch first, then:
pip install git+https://git@github.com/aleximmer/laplace
pip install git+https://git@github.com/wiseodd/laplace-bayesopt
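After installing, a quick sanity check is to import the packages from Python. This is only an illustrative check; it assumes the two pip installs above succeeded (laplace is the import name of the laplace-torch library, laplace_bayesopt is the import name of this package).

import torch
import laplace           # provided by the laplace-torch package
import laplace_bayesopt  # provided by this package

print(torch.__version__)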
Usage
Basic usage
import torch

from laplace_bayesopt.botorch import LaplaceBoTorch

def get_net():
    # Return a *freshly-initialized* PyTorch model
    return torch.nn.Sequential(
        ...
    )

# Initial X, Y pairs, e.g. obtained via random search
train_X, train_Y = ..., ...

model = LaplaceBoTorch(get_net, train_X, train_Y)

# Use this model in your existing BoTorch loop, e.g. to replace BoTorch's MultiTaskGP model.
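For concreteness, one possible get_net is sketched below. The two-hidden-layer MLP (input dimension 2, hidden width 50) is only an illustrative assumption, not a requirement of the library; any PyTorch model mapping your inputs to the objective value(s) works.

import torch

def get_net():
    # Example only: a small MLP mapping a 2D input to a scalar objective value.
    # The architecture and layer sizes are arbitrary illustrative choices.
    return torch.nn.Sequential(
        torch.nn.Linear(2, 50),
        torch.nn.ReLU(),
        torch.nn.Linear(50, 50),
        torch.nn.ReLU(),
        torch.nn.Linear(50, 1),
    )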
The full arguments of LaplaceBoTorch can be found in the class documentation. Check out a full BoTorch example in examples/botorch/experiments.py.
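As a sketch of what "your existing BoTorch loop" might look like, the snippet below runs a minimal loop with a toy objective. The objective f, the bounds, the network architecture, the acquisition settings, and the choice of re-fitting LaplaceBoTorch from scratch on the accumulated data each iteration are illustrative assumptions; only the LaplaceBoTorch(get_net, train_X, train_Y) constructor comes from the usage above, and ExpectedImprovement and optimize_acqf are standard BoTorch utilities. See examples/botorch/experiments.py for the full example.

import torch
from botorch.acquisition import ExpectedImprovement
from botorch.optim import optimize_acqf

from laplace_bayesopt.botorch import LaplaceBoTorch

def f(x):
    # Toy objective (to be maximized); replace with your real black-box function.
    return -((x - 0.5) ** 2).sum(dim=-1, keepdim=True)

def get_net():
    # A freshly-initialized network per call, as sketched above.
    return torch.nn.Sequential(
        torch.nn.Linear(2, 50), torch.nn.ReLU(), torch.nn.Linear(50, 1)
    )

bounds = torch.tensor([[0.0, 0.0], [1.0, 1.0]])  # search space: the unit square

# Initial X, Y pairs, e.g. obtained via random search.
train_X = torch.rand(10, 2)
train_Y = f(train_X)

for _ in range(20):
    # Surrogate: Laplace-approximated neural network fit on all data so far.
    model = LaplaceBoTorch(get_net, train_X, train_Y)

    # Standard BoTorch acquisition: expected improvement over the incumbent.
    acqf = ExpectedImprovement(model, best_f=train_Y.max())
    new_x, _ = optimize_acqf(
        acqf, bounds=bounds, q=1, num_restarts=10, raw_samples=64
    )

    # Evaluate the objective at the proposed point and grow the data set.
    new_y = f(new_x)
    train_X = torch.cat([train_X, new_x])
    train_Y = torch.cat([train_Y, new_y])

print("Best observed value:", train_Y.max().item())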
Useful References
- General Laplace approximation: https://arxiv.org/abs/2106.14806
- Laplace for Bayesian optimization: https://arxiv.org/abs/2304.08309
- Benchmark of neural-net-based Bayesian optimizers: https://arxiv.org/abs/2305.20028
- The case for neural networks for Bayesian optimization: https://arxiv.org/abs/2104.11667
Citation
@inproceedings{kristiadi2023promises,
  title={Promises and Pitfalls of the Linearized {L}aplace in {B}ayesian Optimization},
  author={Kristiadi, Agustinus and Immer, Alexander and Eschenhagen, Runa and Fortuin, Vincent},
  booktitle={AABI},
  year={2023}
}
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
laplace_bayesopt-0.1.0.tar.gz (8.4 kB)
Built Distribution
Hashes for laplace_bayesopt-0.1.0-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | ce1b86e9ae0b4bd6bee3a90290b659d565639bc148bbf6c59f816dae54678951
MD5 | ac217b3288e843f6dc0539cffbb6e136
BLAKE2b-256 | 0386bfb2701fc4e5e4cb37ce5eb60465b0a804bd1a6948bc8afd5c1b029b7a60