
BOTier: Multi-Objective Bayesian Optimization with Tiered Preferences over Experiment Outcomes and Inputs

In addition to the primary optimization objective, scientific optimization problems often involve a series of subordinate objectives, which can be expressed as preferences over either the outputs of an experiment or the experiment inputs (e.g. minimizing experimental cost). BoTier provides a flexible composite objective for expressing hierarchical user preferences over both experiment inputs and outputs. The details are described in the corresponding paper.

botier is a lightweight plug-in for botorch and can be readily integrated with the botorch ecosystem for Bayesian Optimization.
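The intuition behind a tiered composite objective can be illustrated with a small, self-contained sketch. This is a toy smooth-gating scheme for building intuition only, not the exact scalarization from the BoTier paper; the function name and formula here are illustrative assumptions:

```python
import math

def tiered_score(values, thresholds, k=100.0):
    """Toy tiered scalarization (illustrative only, not botier's actual formula).

    values:     normalized objective values in [0, 1], higher is better,
                ordered from most to least important tier
    thresholds: satisfaction threshold for each tier
    k:          steepness of the smooth 'threshold met' gate
    """
    score, gate = 0.0, 1.0
    for v, t in zip(values, thresholds):
        # each tier contributes its value, capped at its threshold ...
        score += gate * min(v, t)
        # ... and unlocks the next tier only once its threshold is (smoothly) met
        gate *= 1.0 / (1.0 + math.exp(-k * (v - t)))
    return score

# A point satisfying tier 1 with a good tier-2 value beats one satisfying
# tier 1 with a poor tier-2 value, which beats a point missing tier 1
# entirely, no matter how good its tier-2 value is:
a = tiered_score([0.9, 0.80], [0.5, 0.7])  # tier 1 met, tier 2 good
b = tiered_score([0.9, 0.30], [0.5, 0.7])  # tier 1 met, tier 2 poor
c = tiered_score([0.3, 0.99], [0.5, 0.7])  # tier 1 missed
```

The smooth sigmoid gate (steepness `k`, analogous in spirit to the `k` argument of `HierarchyScalarizationObjective`) keeps the composite objective differentiable, which matters for gradient-based acquisition optimization.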

Installation

botier can be readily installed from the Python Package Index (PyPI).

pip install botier

Usage

The following code snippet shows a minimal example of using the hierarchical scalarization objective.

In this example, our primary goal is to maximize $\sin(2\pi x)$ up to a value of at least 0.5. Once this threshold is satisfied, the value of x should be minimized.

import torch
from gpytorch.mlls import ExactMarginalLogLikelihood
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from botorch.acquisition.monte_carlo import qExpectedImprovement
from botorch.optim import optimize_acqf

import numpy as np
from matplotlib import pyplot as plt

from botier import AuxiliaryObjective, HierarchyScalarizationObjective

# define the 'auxiliary objectives' that eventually make up the overall optimization objective
objectives = [
    AuxiliaryObjective(output_index=0, abs_threshold=0.5, upper_bound=1.0, lower_bound=-1.0),
    AuxiliaryObjective(maximize=False, calculation=lambda y, x: x[..., 0], abs_threshold=0.0, lower_bound=0.0, upper_bound=1.0),
]
global_objective = HierarchyScalarizationObjective(objectives, k=1E2, normalized_objectives=True)

# generate some training data
train_x = torch.rand(5, 1).double()
train_y = torch.sin(2 * torch.pi * train_x)

budget = 20
for n in range(budget):
    
    # fit a simple BoTorch surrogate model
    surrogate = SingleTaskGP(train_x, train_y)
    mll = ExactMarginalLogLikelihood(surrogate.likelihood, surrogate)
    fit_gpytorch_mll(mll)

    # instantiate a BoTorch Monte-Carlo acquisition function, passing the
    # botier.HierarchyScalarizationObjective as the 'objective' argument
    acqf = qExpectedImprovement(
        model=surrogate,
        objective=global_objective,
        best_f=torch.max(train_y)  # best raw outcome observed so far
    )

    new_candidate, _ = optimize_acqf(acqf, bounds=torch.tensor([[0.0], [1.0]]), q=1, num_restarts=5, raw_samples=512)


    # evaluate the new candidate by running the "experiment" (here, the true function)
    new_candidate_y = torch.sin(2 * torch.pi * new_candidate)

    # update the training points
    train_x = torch.cat([train_x, new_candidate])
    train_y = torch.cat([train_y, new_candidate_y])

    print(f"iteration {n + 1}: candidate={new_candidate.item()}, objective={new_candidate_y.item()}")


plt.plot(np.linspace(0, 1, 100), torch.sin(2 * torch.pi * torch.linspace(0, 1, 100)), label="true function", zorder=0)
plt.scatter(train_x.numpy(), train_y.numpy(), s=25, marker="x", cmap="spring", c=np.arange(len(train_x)), label="selected points")
plt.colorbar()
plt.legend()
plt.show()
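With `normalized_objectives=True`, each auxiliary objective is rescaled to a common [0, 1] range using its `lower_bound` and `upper_bound` before scalarization, so that thresholds on differently-scaled quantities become comparable. A minimal sketch of such min-max normalization (the function below is illustrative, not botier's actual API):

```python
def normalize(value, lower, upper, maximize=True):
    """Min-max normalization to [0, 1]; for minimization objectives the
    scale is flipped so that 'higher is better' holds after rescaling."""
    scaled = (value - lower) / (upper - lower)
    return scaled if maximize else 1.0 - scaled

# output objective from the example: bounds [-1, 1], maximized
print(normalize(0.5, -1.0, 1.0))                   # -> 0.75
# input objective from the example: bounds [0, 1], minimized
print(normalize(0.25, 0.0, 1.0, maximize=False))   # -> 0.75
```

On this common scale, the `abs_threshold` of each tier can be compared directly against the normalized value, regardless of the original units of the experiment outputs or inputs.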

For more detailed usage examples, see examples.

Contributors

Felix Strieth-Kalthoff (@felix-s-k), Mohammad Haddadnia (@mohaddadnia), Leonie Grashoff (@lgrashoff)
