
Boax: A Bayesian Optimization library for JAX.


Overview | Installation | Getting Started | Documentation

Boax is currently in early alpha and under active development!

Overview

Boax is a composable library of core components for Bayesian Optimization that is designed for flexibility. It comes with low-level interfaces for:

  • Core capabilities (boax.core):
    • Common Distributions
    • Monte-Carlo Samplers
  • Fitting a surrogate model to data (boax.prediction):
    • Model Functions
    • Objective Functions
  • Constructing and optimizing acquisition functions (boax.optimization):
    • Acquisition Functions
    • Optimizer Functions

Installation

You can install the latest released version of Boax from PyPI via:

pip install boax

or you can install the latest development version from GitHub:

pip install git+https://github.com/Lando-L/boax.git

Basic Usage

Here is a basic example of using the Boax API for defining a Gaussian Process model, constructing an Acquisition function, and generating the next batch of data points to query. For more details, check out the docs.
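The snippets below reference training arrays `x_train` and `y_train`, which the example assumes already exist. A minimal synthetic setup (an illustrative assumption, not part of the Boax docs) might look like:

```python
from jax import numpy as jnp
from jax import random

# Hypothetical training data for the walkthrough below; the original
# example assumes x_train and y_train are already defined.
key_x, key_noise = random.split(random.key(42))

# Ten one-dimensional inputs in [-1, 1], with noisy observations of sin(3x).
x_train = random.uniform(key_x, shape=(10, 1), minval=-1.0, maxval=1.0)
y_train = jnp.sin(3.0 * x_train[:, 0]) + 0.1 * random.normal(key_noise, (10,))
```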

  1. Defining a Gaussian Process model:
from boax.prediction import models

model = models.gaussian_process.exact(
  models.means.zero(),
  models.kernels.scaled(
    models.kernels.rbf(1.0), 0.5
  ),
  models.likelihoods.gaussian(1e-4),
  x_train,
  y_train,
)
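Under the hood, an exact GP with a zero mean, a scaled RBF kernel, and Gaussian observation noise has a closed-form posterior mean, K_*ᵀ(K + σ²I)⁻¹y. The following self-contained sketch (plain JAX with the same hyperparameters as above, not Boax's actual implementation) illustrates the computation:

```python
from jax import numpy as jnp

def rbf_kernel(x1, x2, length_scale=1.0, output_scale=0.5):
  # Scaled RBF kernel: s * exp(-||x - x'||^2 / (2 * l^2)).
  sq_dists = jnp.sum((x1[:, None, :] - x2[None, :, :]) ** 2, axis=-1)
  return output_scale * jnp.exp(-0.5 * sq_dists / length_scale**2)

def gp_posterior_mean(x_test, x_train, y_train, noise=1e-4):
  # Exact GP posterior mean with a zero prior mean:
  # mu(x_test) = K_*^T (K + noise * I)^-1 y.
  k_train = rbf_kernel(x_train, x_train) + noise * jnp.eye(x_train.shape[0])
  k_cross = rbf_kernel(x_train, x_test)
  return k_cross.T @ jnp.linalg.solve(k_train, y_train)

# Toy data: by symmetry, the posterior mean at x = 0 is (near) zero.
x_train = jnp.array([[-0.5], [0.0], [0.5]])
y_train = jnp.array([0.2, 0.0, -0.2])
mu = gp_posterior_mean(jnp.array([[0.0]]), x_train, y_train)
```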
  2. Constructing an Acquisition function:
from jax import vmap
from boax.optimization import acquisitions

acqf = models.outcome_transformed(
  vmap(model),
  acquisitions.upper_confidence_bound(2.0)
)
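Upper confidence bound scores a candidate by trading off the posterior mean against its uncertainty; in the common formulation, α(x) = μ(x) + √β·σ(x). A minimal standalone version of the formula (an illustration, not Boax's implementation) is:

```python
from jax import numpy as jnp

def upper_confidence_bound(mean, stddev, beta=2.0):
  # UCB acquisition in the common formulation: mean + sqrt(beta) * stddev.
  # Larger beta favors exploration (high-uncertainty points).
  return mean + jnp.sqrt(beta) * stddev

# The second point has a higher mean, but the first point's larger
# uncertainty gives it the higher UCB score.
scores = upper_confidence_bound(jnp.array([0.1, 0.3]), jnp.array([0.5, 0.1]))
```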
  3. Generating the next batch of data points to query:
from jax import numpy as jnp
from jax import random
from boax.core import distributions, samplers
from boax.optimization import optimizers

key = random.key(0)

batch_size, num_results, num_restarts = 1, 100, 10
bounds = jnp.array([[-1.0, 1.0]])

sampler = samplers.halton_uniform(
  distributions.uniform.uniform(bounds[:, 0], bounds[:, 1])
)

optimizer = optimizers.batch(
  optimizers.initializers.q_batch(
    acqf, sampler, batch_size, num_results, num_restarts,
  ),
  optimizers.solvers.scipy(
    acqf, bounds,
  ),
)

next_x, value = optimizer(key)
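The batch optimizer above follows a common two-phase pattern: draw many quasi-random candidates, keep the highest-scoring ones as restart points, then refine each with a gradient-based solver. A simplified sketch of the candidate-selection phase (plain JAX on a toy acquisition function; `select_restarts` is a hypothetical helper, not Boax's API):

```python
from jax import numpy as jnp
from jax import random

def select_restarts(key, acqf, bounds, num_results=100, num_restarts=10):
  # Phase 1 of multi-start optimization: sample candidates uniformly
  # within the bounds, score them under the acquisition function, and
  # keep the top `num_restarts` as starting points for a local solver.
  dim = bounds.shape[0]
  candidates = random.uniform(
      key, shape=(num_results, dim),
      minval=bounds[:, 0], maxval=bounds[:, 1],
  )
  scores = acqf(candidates)
  top = jnp.argsort(scores)[-num_restarts:]
  return candidates[top]

# Toy acquisition peaked at x = 0.25, so restarts cluster near it.
acqf = lambda x: -jnp.sum((x - 0.25) ** 2, axis=-1)
starts = select_restarts(random.key(0), acqf, jnp.array([[-1.0, 1.0]]))
```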

Citing Boax

To cite Boax, please use the following BibTeX entry:

@software{boax2023github,
  author = {Lando L{\"o}per},
  title = {{B}oax: A Bayesian Optimization library for {JAX}},
  url = {https://github.com/Lando-L/boax},
  version = {0.1.2},
  year = {2023},
}

In the above BibTeX entry, the version number is intended to be that from boax/version.py, and the year corresponds to the project's open-source release.
