Bayesian Optimization in PyTorch

Project description


BoTorch is a library for Bayesian Optimization built on PyTorch.

BoTorch is currently in beta and under active development!

Why BoTorch?

BoTorch

  • Provides a modular and easily extensible interface for composing Bayesian optimization primitives, including probabilistic models, acquisition functions, and optimizers.
  • Harnesses the power of PyTorch, including auto-differentiation, native support for highly parallelized modern hardware (e.g. GPUs) using device-agnostic code, and a dynamic computation graph.
  • Supports Monte Carlo-based acquisition functions via the reparameterization trick, which makes it straightforward to implement new ideas without having to impose restrictive assumptions about the underlying model (see the sketch after this list).
  • Enables seamless integration with deep and/or convolutional architectures in PyTorch.
  • Has first-class support for state-of-the-art probabilistic models in GPyTorch, including support for multi-task Gaussian Processes (GPs), deep kernel learning, deep GPs, and approximate inference.
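
As a concrete illustration of the Monte Carlo point above, here is a minimal, self-contained sketch of an MC acquisition function (qExpectedImprovement) driven by a quasi-Monte Carlo sampler. The toy data, the number of MC samples, and the import paths are illustrative assumptions following the BoTorch 0.6.x API (where SobolQMCNormalSampler takes num_samples); argument names may differ in other releases:

import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_model
from botorch.acquisition.monte_carlo import qExpectedImprovement
from botorch.sampling.samplers import SobolQMCNormalSampler
from gpytorch.mlls import ExactMarginalLogLikelihood

# toy training data (illustrative only)
train_X = torch.rand(20, 2)
train_Y = 1 - (train_X - 0.5).norm(dim=-1, keepdim=True)
train_Y = (train_Y - train_Y.mean()) / train_Y.std()

gp = SingleTaskGP(train_X, train_Y)
fit_gpytorch_model(ExactMarginalLogLikelihood(gp.likelihood, gp))

# Fixed base samples plus the reparameterization trick keep the Monte Carlo
# estimate differentiable with respect to the candidate points.
sampler = SobolQMCNormalSampler(num_samples=256)
qEI = qExpectedImprovement(model=gp, best_f=train_Y.max(), sampler=sampler)

X = torch.rand(3, 5, 2)  # 3 candidate sets of q=5 points each
print(qEI(X))            # one acquisition value per candidate set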

Target Audience

The primary audience for hands-on use of BoTorch is researchers and sophisticated practitioners in Bayesian Optimization and AI. We recommend using BoTorch as a low-level API for implementing new algorithms for Ax. Ax has been designed to be an easy-to-use platform for end-users, while remaining flexible enough for Bayesian Optimization researchers to plug into for handling of feature transformations, (meta-)data management, storage, etc. We recommend that end-users who are not actively doing research on Bayesian Optimization simply use Ax.

Installation

Installation Requirements

  • Python >= 3.7
  • PyTorch >= 1.9
  • gpytorch >= 1.6
  • scipy

Installing the latest release

The latest release of BoTorch is easily installed either via Anaconda (recommended):

conda install botorch -c pytorch -c gpytorch

or via pip:

pip install botorch

You can customize your PyTorch installation (e.g., CUDA version or CPU-only) by following the PyTorch installation instructions.
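
A quick way to confirm the environment afterwards (not part of the official instructions, just a sanity check) is to print the installed versions and see whether a CUDA device is visible:

import torch
import botorch
import gpytorch

print("botorch", botorch.__version__)
print("gpytorch", gpytorch.__version__)
print("torch", torch.__version__)
print("CUDA available:", torch.cuda.is_available())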

Important note for macOS users:

  • Make sure your PyTorch build is linked against MKL (the non-optimized version of BoTorch can be up to an order of magnitude slower in some settings). Setting this up manually on macOS can be tricky; to ensure this works properly, please follow the PyTorch installation instructions.
  • If you need CUDA on macOS, you will need to build PyTorch from source. Please consult the PyTorch installation instructions above.

Installing from latest main branch

If you would like to try our bleeding edge features (and don't mind potentially running into the occasional bug here or there), you can install the latest development version directly from GitHub (this will also require installing the current GPyTorch development version):

pip install --upgrade git+https://github.com/cornellius-gp/gpytorch.git
pip install --upgrade git+https://github.com/pytorch/botorch.git

Manual / Dev install

Alternatively, you can do a manual install. For a basic install, run:

git clone https://github.com/pytorch/botorch.git
cd botorch
pip install -e .

To customize the installation, you can also run the following variants of the above:

  • pip install -e .[dev]: Also installs all tools necessary for development (testing, linting, docs building; see Contributing below).
  • pip install -e .[tutorials]: Also installs all packages necessary for running the tutorial notebooks.

Getting Started

Here's a quick rundown of the main components of a Bayesian optimization loop. For more details, see our Documentation and the Tutorials.

  1. Fit a Gaussian Process model to data
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_model
from gpytorch.mlls import ExactMarginalLogLikelihood

train_X = torch.rand(10, 2)
Y = 1 - (train_X - 0.5).norm(dim=-1, keepdim=True)  # explicit output dimension
Y += 0.1 * torch.rand_like(Y)
train_Y = (Y - Y.mean()) / Y.std()

gp = SingleTaskGP(train_X, train_Y)
mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
fit_gpytorch_model(mll)
  2. Construct an acquisition function
from botorch.acquisition import UpperConfidenceBound

UCB = UpperConfidenceBound(gp, beta=0.1)
  3. Optimize the acquisition function
from botorch.optim import optimize_acqf

bounds = torch.stack([torch.zeros(2), torch.ones(2)])
candidate, acq_value = optimize_acqf(
    UCB, bounds=bounds, q=1, num_restarts=5, raw_samples=20,
)
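
In practice, these three steps are repeated in a loop: evaluate the suggested candidate on the real objective, append the new observation, refit the model, and re-optimize the acquisition function. The following is a rough sketch of such a loop (not from the original docs), reusing the imports and variables from the steps above and standing in a synthetic objective for the real, expensive experiment; the number of iterations, the re-standardization, and the UCB beta are illustrative choices, not recommendations:

for _ in range(10):
    # refit the GP on the standardized observations collected so far
    train_Y = (Y - Y.mean()) / Y.std()
    gp = SingleTaskGP(train_X, train_Y)
    mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
    fit_gpytorch_model(mll)

    # rebuild and optimize the acquisition function on the refit model
    UCB = UpperConfidenceBound(gp, beta=0.1)
    candidate, _ = optimize_acqf(
        UCB, bounds=bounds, q=1, num_restarts=5, raw_samples=20,
    )

    # "evaluate" the candidate with the same synthetic objective used in step 1
    new_y = 1 - (candidate - 0.5).norm(dim=-1, keepdim=True)
    new_y += 0.1 * torch.rand_like(new_y)

    # append the new observation and continue
    train_X = torch.cat([train_X, candidate])
    Y = torch.cat([Y, new_y])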

Citing BoTorch

If you use BoTorch, please cite the following paper:

M. Balandat, B. Karrer, D. R. Jiang, S. Daulton, B. Letham, A. G. Wilson, and E. Bakshy. BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization. Advances in Neural Information Processing Systems 33, 2020.

@inproceedings{balandat2020botorch,
  title={{BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization}},
  author={Balandat, Maximilian and Karrer, Brian and Jiang, Daniel R. and Daulton, Samuel and Letham, Benjamin and Wilson, Andrew Gordon and Bakshy, Eytan},
  booktitle = {Advances in Neural Information Processing Systems 33},
  year={2020},
  url = {http://arxiv.org/abs/1910.06403}
}

See here for an incomplete selection of peer-reviewed papers that build off of BoTorch.

Contributing

See the CONTRIBUTING file for how to help out.

License

BoTorch is MIT licensed, as found in the LICENSE file.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

botorch-0.6.3.tar.gz (420.3 kB)

Uploaded Source

Built Distribution

botorch-0.6.3-py3-none-any.whl (357.5 kB)

Uploaded Python 3

File details

Details for the file botorch-0.6.3.tar.gz.

File metadata

  • Download URL: botorch-0.6.3.tar.gz
  • Upload date:
  • Size: 420.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/34.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.9 tqdm/4.63.1 importlib-metadata/4.11.3 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.9.12

File hashes

Hashes for botorch-0.6.3.tar.gz:

  • SHA256: 123285bc1f4083461984e20736a1ff4dca5da31aeb93a626dcf736f5597ecf5b
  • MD5: 86d7eeddab5f13e4b6b1eff5f9df59f0
  • BLAKE2b-256: 21bf35bbf912351e87784d31d43d4b7e8cbf7fe0bce41866a5444d35a0d05fed

See more details on using hashes here.

File details

Details for the file botorch-0.6.3-py3-none-any.whl.

File metadata

  • Download URL: botorch-0.6.3-py3-none-any.whl
  • Upload date:
  • Size: 357.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/34.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.9 tqdm/4.63.1 importlib-metadata/4.11.3 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.9.12

File hashes

Hashes for botorch-0.6.3-py3-none-any.whl:

  • SHA256: cbf57cc5b56b3ec500739bffa52fb0c77f7292370c02077defa2f56d144b6b03
  • MD5: 9239d7cd45da020b1b1bd48fb58288bb
  • BLAKE2b-256: d2fa407e4698ffe52c62a08cdfe10e4ee532be793ed8f7f6951d3350c5f06baf

See more details on using hashes here.
