OptiMOBO

Solve bi-objective optimisation problems using multi-objective Bayesian optimisation (MOBO).

This repo contains implementations of two MOBO methods. They are designed to solve bi-objective minimisation problems; constraints are not supported. The methods are:

  • Mono-surrogate. This method uses a single model. Objective vectors are aggregated into a single scalar value and a Gaussian process is built on the scalarised values.
  • Multi-surrogate. This method uses multiple models, one for each objective. Multi-objective acquisition functions are used to identify new sample points.

The two methods are implemented as the classes MultiSurrogateOptimiser and MonoSurrogateOptimiser. They are designed to solve problems that inherit from the Problem class of the pymoo library.
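To make the distinction concrete, the sketch below shows the data flow of the two approaches on a small set of evaluated points. It uses scipy's RBFInterpolator purely as a stand-in for the Gaussian process models described above, and a simple weighted-sum aggregation; it illustrates the idea, not OptiMOBO's internal API.

import numpy as np
from scipy.interpolate import RBFInterpolator  # stand-in for a Gaussian process model

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(20, 2))                   # 20 evaluated design points (n_var=2)
F = np.column_stack([100 * (X**2).sum(axis=1),         # objective 1
                     (X[:, 0] - 1)**2 + X[:, 1]**2])   # objective 2

# Mono-surrogate: aggregate each objective vector into one scalar and fit a single model.
weights = np.array([0.5, 0.5])
mono_model = RBFInterpolator(X, F @ weights)

# Multi-surrogate: fit one model per objective and keep the objectives separate.
multi_models = [RBFInterpolator(X, F[:, i]) for i in range(F.shape[1])]

x_new = np.array([[0.3, -0.1]])
print(mono_model(x_new))                 # one scalarised prediction
print([m(x_new) for m in multi_models])  # one prediction per objective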

Examples

The following code defines a bi-objective problem, MyProblem, and uses multi-surrogate Bayesian optimisation (with Tchebicheff aggregation as the acquisition function) to solve it.

import numpy as np
from optimisers import MultiSurrogateOptimiser
# Tchebicheff is used as the acquisition function below; the import path here is an assumption.
from scalarisations import Tchebicheff
from pymoo.core.problem import ElementwiseProblem

class MyProblem(ElementwiseProblem):

    def __init__(self):
        # Two decision variables, two objectives, each variable bounded in [-2, 2].
        super().__init__(n_var=2,
                         n_obj=2,
                         xl=np.array([-2,-2]),
                         xu=np.array([2,2]))

    def _evaluate(self, x, out, *args, **kwargs):
        f1 = 100 * (x[0]**2 + x[1]**2)
        f2 = (x[0]-1)**2 + x[1]**2
        out["F"] = [f1, f2]

problem = MyProblem()
optimi = MultiSurrogateOptimiser(problem, [0,0], [700,12])
out = optimi.solve(n_iterations=100, display_pareto_front=True, n_init_samples=20,
                   sample_exponent=3, acquisition_func=Tchebicheff([0,0],[700,12]))

This will return a Pareto set approximation:

(Figure: Pareto front approximation for MyProblem)

For the multi-objective benchmark problem DTLZ5:

from pymoo.problems import get_problem

# DTLZ5 restricted to two objectives and five decision variables.
problem = get_problem("dtlz5", n_obj=2, n_var=5)
optimi = MultiSurrogateOptimiser(problem, [0,0], [1.3,1.3])
out = optimi.solve(n_iterations=100, display_pareto_front=True, n_init_samples=20,
                   sample_exponent=3, acquisition_func=Tchebicheff([0,0],[1.3,1.3]))

This will return:

(Figure: Pareto front approximation for DTLZ5)

The output is a tuple containing:

  • Solutions on the Pareto front approximation.
  • The corresponding inputs to the solutions on the Pareto front.
  • All evaluated solutions.
  • All inputs used in the search.
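
Assuming the tuple is ordered as listed above, the results can be unpacked as follows (a sketch; the exact types returned are not documented here):

pf_approx, pf_inputs, all_evals, all_inputs = out

print(len(pf_approx))   # number of points on the Pareto front approximation
print(len(all_evals))   # total number of evaluated solutions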

Key Features

Mono and multi-surrogate:

Two optimisers based on different methods.
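
A mono-surrogate run might look like the sketch below, assuming MonoSurrogateOptimiser exposes the same constructor and solve() arguments as the MultiSurrogateOptimiser examples above (the shared signature is an assumption, not a documented guarantee):

from optimisers import MonoSurrogateOptimiser

# Assumed to mirror the MultiSurrogateOptimiser interface shown in the Examples section.
optimi = MonoSurrogateOptimiser(problem, [0, 0], [700, 12])
out = optimi.solve(n_iterations=100, n_init_samples=20,
                   acquisition_func=Tchebicheff([0, 0], [700, 12]))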

Choice of acquisition/aggregation functions:

In mono-surrogate MOBO, scalarisation functions are used to aggregate objective vectors into a single value that can be used by the optimiser. In multi-surrogate MOBO, scalarisation functions are used as convergence measures to select sample points. This package contains 10 scalarisation functions that can be used in the contexts mentioned above. Options include:

  • Weighted Sum
  • Tchebicheff
  • Modified Tchebicheff
  • Augmented Tchebicheff
  • Weighted Norm
  • Weighted Power
  • Weighted Product
  • PBI
  • IPBI
  • Exponential Weighted Criterion

They are written so they can be used in any context.
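
As a plain-NumPy illustration of what two of these functions compute (independent of the package's own classes), the weighted sum and Tchebicheff scalarisations of an objective vector f, with weights w and ideal point z*, can be written as:

import numpy as np

def weighted_sum(f, w):
    # sum_i w_i * f_i
    return np.dot(w, f)

def tchebicheff(f, w, ideal):
    # max_i w_i * |f_i - z*_i|
    return np.max(w * np.abs(f - ideal))

f = np.array([350.0, 6.0])      # an objective vector
w = np.array([0.5, 0.5])        # weight vector
ideal = np.array([0.0, 0.0])    # ideal point
print(weighted_sum(f, w), tchebicheff(f, w, ideal))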

Experimental Parameters

Various experimental parameters can be customised, including the number of iterations, the number of initial samples, and the choice of acquisition/aggregation function.
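
The solve() calls in the Examples section already expose several of these knobs. For instance, a run with a smaller budget might look like the following (parameter names taken from the examples above; their defaults are not documented here):

out = optimi.solve(n_iterations=50,         # number of optimisation iterations
                   n_init_samples=10,       # size of the initial sample set
                   sample_exponent=3,
                   display_pareto_front=False,
                   acquisition_func=Tchebicheff([0, 0], [700, 12]))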

Requirements

  • numpy
  • scipy
  • pygmo
  • pymoo
