mobo

A rational and extensible algorithm for solving multi-objective optimization problems

Introduction

mobo is an algorithm for solving multi-objective optimization problems without imposing the bias of weights. Rather than converging to a single 'optimal' solution, mobo produces an ensemble of rational solutions which can then be further downselected according to the constraints of a particular application.

Installation

mobo can be installed simply with pip.

$ python3 -m pip install mobo

For testing or development, you will need to clone the repo.

$ git clone https://github.com/seatonullberg/mobo.git
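From there, one option is an editable install so that local changes take effect immediately. This command is an assumption rather than part of the original docs, and it presumes the repo ships a setup.py:

$ python3 -m pip install -e ./mobo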

Example

An introductory example, fitting a polynomial, is available in the examples directory. Snippets of that example are shown here. The first step in preparing an optimizer is to define the parameters to be optimized.

parameters = [
    Parameter("a", -1.0, 1.0),
    Parameter("b", -2.0, 0.0),
    Parameter("c", 0.0, 1.0),
]

Each parameter has a name, a lower bound, and an upper bound. Next, it is necessary to define the quantities of interest which will be evaluated with the fitted parameters.

qois = [
    QoI("pt0", evaluate_pt0, -0.062),
    QoI("pt1", evaluate_pt1, 0.05),
    QoI("pt2", evaluate_pt2, -1.4),
    QoI("pt3", evaluate_pt3, 2.0),
]

Each quantity of interest has a name, an associated evaluation function, and a target value. The evaluation function should take a single argument, a dict mapping parameter names to their values, and return a float.

# target coefficients: 0.3, -1.2, 0.5
polynomial = lambda a, b, c, x: a*x**3 + b*x**2 + c*x

# target: -0.062
def evaluate_pt0(params):
    x = -0.1
    return polynomial(params["a"], params["b"], params["c"], x)
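The remaining evaluation functions (evaluate_pt1 through evaluate_pt3) follow the same pattern at different values of x. A closure factory is one way to avoid the repetition; make_evaluator below is illustrative and not part of the original example, and the actual x values live in the examples directory.

def make_evaluator(x):
    # Returns an evaluation function bound to a fixed x, matching the
    # required signature: dict of parameter values in, float out.
    def evaluate(params):
        return polynomial(params["a"], params["b"], params["c"], x)
    return evaluate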

After these fundamental components are defined, the mobo-centric components should be chosen.

clusterer = KmeansClusterer(n_clusters=2)
error_calculator = SquaredErrorCalculator()
filters = [ParetoFilter(), PercentileFilter()]
projector = PCAProjector()

A clusterer partitions the parameter space, an error calculator determines a cost for each parameterization, filters purge poor parameterizations (a sketch of the Pareto filtering idea follows the configuration snippet below), and a projector maps a high-dimensional parameter space down to 2D for more effective clustering. Since the algorithm is iterative, each iteration requires a 'local' configuration, which is passed into a 'global' configuration before finally constructing the optimizer.

n_samples = 1000    # parameter sets drawn per iteration (example value)
n_iterations = 5    # number of iterations to run (example value)

local_config = LocalConfiguration(
    n_samples, clusterer, error_calculator, filters, projector
)
local_configurations = [local_config for _ in range(n_iterations)]
global_config = GlobalConfiguration(
    n_samples, local_configurations, parameters, qois
)
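As promised above, here is a minimal sketch of the non-domination test behind a Pareto filter. It is illustrative only, not mobo's actual ParetoFilter implementation, and it assumes the per-objective errors are stacked row-wise in a NumPy array.

import numpy as np

def pareto_mask(errors):
    # errors: (n_parameterizations, n_objectives) array of costs.
    # Row i survives unless some row j is at least as good on every
    # objective and strictly better on at least one.
    n = errors.shape[0]
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        dominates_i = (np.all(errors <= errors[i], axis=1)
                       & np.any(errors < errors[i], axis=1))
        if dominates_i.any():
            mask[i] = False
    return mask

Keeping only the non-dominated rows at each iteration concentrates subsequent sampling on the most promising regions of parameter space.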

Once the global configuration has been constructed, execution of the optimizer is trivial.

optimizer = Optimizer(global_config)
optimizer()

After optimization you will be left with data files representing the results of each iteration. As an aside, you may notice that the optimizer is a callable object; I use this theme in many objects which publicly expose only a single method.
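The pattern is simply a class implementing __call__. A minimal sketch, not the actual Optimizer source, looks like this:

class Optimizer:
    """Sketch of the callable-object pattern (not the real mobo source)."""

    def __init__(self, config):
        self._config = config  # everything is fixed at construction time

    def __call__(self):
        # The single public entry point runs every configured iteration.
        for local_config in self._config.local_configurations:
            self._run_iteration(local_config)  # hypothetical private helper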

If you were to visualize the final results, you should see something like this.

[Figure: polynomial fit results]

The black line is the target polynomial, the black dots are the points used as quantities of interest, the blue lines are predictions from cluster 0, and the red lines are predictions from cluster 1. As you can see, the fit is quite close. Other choices of evaluation points should yield similar results so long as they are near the local extrema. The following plot illustrates some of the internal mechanics.

[Figure: polynomial cluster results]

The points are parameterizations projected onto their primary PCA vectors. Blue points correspond to parameterizations from cluster 0 and red to those from cluster 1. This view is essentially what the KDE sampler "sees" when it selects new parameterizations, and it helps show why the parameters in different clusters produce distinct predictions.
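To make the project-then-sample idea concrete, here is a rough sketch using scikit-learn and SciPy. It illustrates the mechanic described above rather than mobo's internal code, and the function name and array shapes are assumptions.

import numpy as np
from scipy.stats import gaussian_kde
from sklearn.decomposition import PCA

def resample_cluster(params, n_new):
    # params: (n, n_parameters) array of surviving parameterizations.
    pca = PCA(n_components=2)
    projected = pca.fit_transform(params)        # what the sampler "sees"
    kde = gaussian_kde(projected.T)              # density estimate in 2D
    new_projected = kde.resample(n_new).T        # draw new 2D candidates
    return pca.inverse_transform(new_projected)  # map back to parameter space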

Algorithm Description

TODO

Acknowledgements

The implementation of this algorithm is based upon work I contributed to the pypospack project. pypospack is the focus of this dissertation, which you may be interested in reading for a more in-depth description of the mathematics involved or for alternative applications of the algorithm.
