
A package of black-box optimization (BBO) algorithms.


Optimization Package (EGL, CMA, and IGL)

This package provides implementations for Enhanced Gradient Learning (EGL), Covariance Matrix Adaptation (CMA), and Iterative Gradient Learning (IGL) algorithms. It includes modules for datasets, distributions, trust regions, normalizers, stopping conditions, and customizable parameters. The minimize function initiates optimization with flexible configurations.

Installation

pip install egl

Usage

Importing

The package supports the execution of EGL, CMA, and IGL algorithms, each configured with unique datasets, distributions, trust regions, and loss functions.

from egl import EGL, CMA, IGL, minimize

Using the Minimize Function

The minimize function provides an API that matches SciPy's scipy.optimize.minimize, with optimized default configurations.

result = minimize(
    function=objective_function,
    x0=starting_point,
    args=[use_second_order, use_weighted_mean_gradient],  # two booleans
    bounds=[(lower_bound, upper_bound)],
    callback=callback_function,
)
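For illustration, here is a minimal objective and callback of the kind that could be passed to minimize. The sphere function and the callback's signature are assumptions for the sketch, not part of the package:

```python
# A simple black-box objective: the sphere function f(x) = sum(x_i^2).
# Any callable mapping a point (a sequence of floats) to a scalar works.
def objective_function(x):
    return sum(xi * xi for xi in x)

# Hypothetical callback, invoked by the optimizer with the current best point.
def callback_function(x):
    print(f"current best: {x}")

starting_point = [1.0, -2.0, 0.5]
print(objective_function(starting_point))  # 5.25
```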

Components

Datasets

EGL manages a dataset of tuples built from the sampled points. There are many ways to decide how to pair the points; we provide several dataset options, and you can also create your own.

  • TuplesDataset: Uses paired data for optimization with enhanced generalization.
  • PairsInRangeDataset: Pairs each point with every other point sampled in the same epoch.
  • PairsInEpsRangeDataset: Pairs each point with every other point within a given Euclidean distance epsilon.
  • PairFromDistributionDataset: A wrapper that samples pairing points from a distribution based on the points' values.
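The epsilon-range pairing can be sketched in plain Python. This is a standalone illustration of the idea, not the package's actual implementation:

```python
import itertools
import math

def pairs_in_eps_range(points, eps):
    """Pair every point with every other point within Euclidean distance eps."""
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    return [
        (p, q)
        for p, q in itertools.combinations(points, 2)
        if dist(p, q) <= eps
    ]

points = [(0.0, 0.0), (0.5, 0.0), (3.0, 3.0)]
print(pairs_in_eps_range(points, eps=1.0))  # only the two nearby points are paired
```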

Distributions

We showed that EGL improves results when calculating a weighted mean gradient based on the function values. The WeightsDistributionBase class defines a base class for creating weight distributions.

  • SigmoidWeights: Applies a sigmoid function with custom gamma and quantile values to distribute weights.
  • QuantileWeights: Keeps a given quantile of the lowest values; all other points receive weight 0.
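The idea behind value-based weighting can be sketched as follows. The exact formula used by the package may differ; this only illustrates down-weighting points with high function values through a sigmoid:

```python
import math

def sigmoid_weights(values, gamma=-10.0, quantile=0.8):
    """Weight each sample by a sigmoid of its distance from a quantile threshold.

    With a negative gamma, low function values receive weights near 1 and
    high values near 0. This is an illustrative sketch, not the package's
    actual SigmoidWeights implementation.
    """
    ranked = sorted(values)
    threshold = ranked[int(quantile * (len(values) - 1))]
    raw = [1.0 / (1.0 + math.exp(-gamma * (v - threshold))) for v in values]
    total = sum(raw)
    return [w / total for w in raw]

w = sigmoid_weights([0.1, 0.2, 5.0, 9.0])
print(w)  # the two small values dominate the weighting
```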

Trust Regions

We implemented two trust regions with different normalizations.

  • TanhTrustRegion: Maps values within a hyperbolic tangent boundary.
  • LinearTrustRegion: Utilizes linear scaling for boundary constraints.

We found that genetic algorithms suffer when the hyperbolic tangent function is used to normalize the values.
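The difference between the two mappings can be sketched as follows (illustrative one-dimensional versions; the package's trust regions operate on whole points and carry additional state):

```python
import math

def linear_map(u, low, high):
    """Map u in [-1, 1] linearly onto [low, high]."""
    return low + (u + 1.0) * 0.5 * (high - low)

def tanh_map(u, low, high):
    """Map an unbounded u onto (low, high) through a hyperbolic tangent."""
    return low + (math.tanh(u) + 1.0) * 0.5 * (high - low)

print(linear_map(0.0, -5.0, 5.0))  # 0.0
print(tanh_map(0.0, -5.0, 5.0))    # 0.0
print(tanh_map(10.0, -5.0, 5.0))   # close to 5.0, never exceeds it
```

The tanh mapping keeps candidates strictly inside the bounds but saturates near the edges, which is one way to understand why value-matching algorithms can struggle with it.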

Normalizers

We created AdaptedOutputUnconstrainedMapping to normalize the function values and ease the convergence of the algorithm. This applies only to convergence-based algorithms (i.e., EGL and IGL). The normalization adapts to the observed sampled values; you can modify the adaptation rate and the outlier value threshold in the class's constructor.
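The adaptive idea can be sketched as running statistics with outlier clipping. The real AdaptedOutputUnconstrainedMapping may use a different scheme; the parameter names below mirror the description but are illustrative:

```python
class AdaptiveOutputNormalizer:
    """Sketch of an adaptive output mapping: track running statistics of the
    observed values and normalize new values against them, clipping outliers."""

    def __init__(self, adaptation_rate=0.1, outlier_threshold=5.0):
        self.rate = adaptation_rate
        self.outlier_threshold = outlier_threshold
        self.mean = 0.0
        self.scale = 1.0

    def update(self, values):
        """Move the running mean and scale toward the latest batch statistics."""
        batch_mean = sum(values) / len(values)
        batch_scale = max(abs(v - batch_mean) for v in values) or 1.0
        self.mean += self.rate * (batch_mean - self.mean)
        self.scale += self.rate * (batch_scale - self.scale)

    def normalize(self, v):
        z = (v - self.mean) / self.scale
        # Clip outliers so extreme values do not dominate surrogate training.
        return max(-self.outlier_threshold, min(self.outlier_threshold, z))

norm = AdaptiveOutputNormalizer()
norm.update([0.0, 10.0])
print(norm.normalize(5.0))     # a moderate normalized value
print(norm.normalize(1000.0))  # clipped to the outlier threshold
```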

Callbacks

Define callbacks to execute custom actions at various stages by subclassing `AlgorithmCallbackHandler`. This class defines four methods:

  • on_algorithm_start: Called once when the algorithm initializes.
  • on_epoch_end: Called at the end of each epoch (after each optimization step of the algorithm).
  • on_algorithm_update: Called on each trust region update.
  • on_algorithm_end: Called once when the algorithm finishes.
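A custom handler might follow this shape. Since this sketch does not import the package, a local no-op stand-in plays the role of the AlgorithmCallbackHandler base class, and the hook signatures are assumptions:

```python
class AlgorithmCallbackHandler:
    """Local stand-in for the package's base class; all hooks are no-ops."""
    def on_algorithm_start(self, *args, **kwargs): ...
    def on_epoch_end(self, *args, **kwargs): ...
    def on_algorithm_update(self, *args, **kwargs): ...
    def on_algorithm_end(self, *args, **kwargs): ...

class LoggingHandler(AlgorithmCallbackHandler):
    """Records which hooks fired, e.g. for progress reporting."""
    def __init__(self):
        self.events = []

    def on_algorithm_start(self, *args, **kwargs):
        self.events.append("start")

    def on_epoch_end(self, *args, **kwargs):
        self.events.append("epoch_end")

    def on_algorithm_update(self, *args, **kwargs):
        self.events.append("update")

    def on_algorithm_end(self, *args, **kwargs):
        self.events.append("end")
```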

You can create custom callback handlers and pass them to train via the callback_handlers parameter.

egl.train(
    ...
    callback_handlers=[<callback_handlers>]
)

Stop Conditions

Define custom stopping conditions to end algorithm iterations under specific criteria:

  • AlgorithmStopCondition: Override the should_stop method to define a custom end condition.
  • Pass instances of your condition to train via the stopping_conditions parameter.
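A stop condition might look like this sketch. A local stand-in plays the role of the AlgorithmStopCondition base class, and the should_stop signature and the record helper are assumptions for illustration:

```python
class AlgorithmStopCondition:
    """Local stand-in for the package's base class."""
    def should_stop(self, *args, **kwargs):
        return False

class BudgetStopCondition(AlgorithmStopCondition):
    """Stop once a fixed budget of function evaluations has been spent."""
    def __init__(self, max_evaluations):
        self.max_evaluations = max_evaluations
        self.used = 0

    def record(self, n):
        """Hypothetical helper: account for n new evaluations."""
        self.used += n

    def should_stop(self, *args, **kwargs):
        return self.used >= self.max_evaluations
```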

Advanced Usage

Using the Train Function

The train method is the entry point for running the EGL, CMA, and IGL models. Its parameters allow fine-grained control over the exploration process, the shrinking strategy, and improvement checks.

egl.train(
    epochs=100,
    exploration_size=50,
    num_loop_without_improvement=5,
    min_iteration_before_shrink=10,
    surrogate_model_training_epochs=60,
    warmup_minibatch=5,
    warmup_loops=6,
    stopping_conditions=[<stop_conditions>],
    callback_handlers=[<callback_handlers>]
)

Key Parameters

  • epochs: The total number of epochs for the training.
  • exploration_size: Number of new points explored each epoch.
  • num_loop_without_improvement: Number of epochs allowed without improvement before the trust region shrinks.
  • min_iteration_before_shrink: Minimum epochs before the first shrink action.
  • surrogate_model_training_epochs (default 60): Epochs to train the surrogate model.
  • warmup_minibatch and warmup_loops (defaults 5 and 6): Number of warm-up minibatches and warm-up loops used to initialize the surrogate model.
  • stopping_conditions: List of stop conditions for early termination.
  • callback_handlers: List of callback handlers for custom actions post-epoch and during updates.
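The interplay of num_loop_without_improvement and min_iteration_before_shrink can be sketched as a simple bookkeeping check. This illustrates the control flow described above, not the package's actual code:

```python
def should_shrink(epoch, epochs_without_improvement,
                  num_loop_without_improvement=5,
                  min_iteration_before_shrink=10):
    """Shrink the trust region only after the warm-up period has passed
    and the objective has stalled for enough consecutive epochs."""
    return (epoch >= min_iteration_before_shrink
            and epochs_without_improvement >= num_loop_without_improvement)

print(should_shrink(epoch=3, epochs_without_improvement=5))   # False: too early
print(should_shrink(epoch=12, epochs_without_improvement=5))  # True: stalled after warm-up
```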

Examples

Running CMA with Sigmoid Weights and Linear Trust Region

from egl import CMA, SigmoidWeights, LinearTrustRegion

# Set up CMA with Sigmoid distribution and Linear trust region
cma = CMA(
    trust_region=LinearTrustRegion(...),
    distribution=SigmoidWeights(gamma=-10, quantile=80),
    ...
)
result = cma.train(epochs=100, ...)

Running EGL with Pairs Dataset and Tanh Trust Region

from egl import EGL, PairsInRangeDataset, TanhTrustRegion

# Set up EGL with PairsInRangeDataset and Tanh trust region
egl = EGL(
    dataset_type=PairsInRangeDataset(...),
    trust_region=TanhTrustRegion(...),
    ...
)
result = egl.train(epochs=100, ...)
