A package to solve global optimization problems in small dimensions. The method is based on sampling algorithms, and specifically implements the high-resolution Langevin algorithm.

Project description

Global Optimization through High-Resolution Sampling

This package provides functions to run a global optimization algorithm designed to explore nonconvex functions through High-Resolution Langevin sampling, following the approach of the accompanying paper. The package includes tools for defining target functions, setting optimization parameters, generating samples, and visualizing empirical probabilities.

Installation

The package is available through pip, and may be installed via:

pip install GlobalOptimizationHRLA

Setup

In order to use this package, you need to define:

  • The target function and its gradient.
  • An initial distribution for the search space.

Main Usage

1. Defining the Target Function and Gradient

The provided example uses the Rastrigin function as the target for optimization.

import numpy as np

# Rastrigin-type target in d dimensions:
# U(x) = d + ||x||^2 - sum_i cos(2*pi*x_i), with global minimum U(0) = 0.
d = 10
U = lambda x: d + np.linalg.norm(x) ** 2 - np.sum(np.cos(2 * np.pi * x))
dU = lambda x: 2 * x + 2 * np.pi * np.sin(2 * np.pi * x)
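As a quick sanity check (illustrative, not part of the package), the target above attains its global minimum of 0 at the origin, where the gradient also vanishes:

```python
import numpy as np

d = 10
U = lambda x: d + np.linalg.norm(x) ** 2 - np.sum(np.cos(2 * np.pi * x))
dU = lambda x: 2 * x + 2 * np.pi * np.sin(2 * np.pi * x)

# At the origin: U(0) = d + 0 - d = 0 and dU(0) = 0.
print(U(np.zeros(d)))                     # 0.0
print(np.allclose(dU(np.zeros(d)), 0.0))  # True
```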

2. Sampling from an Initial Distribution

Define an initial distribution from which samples are generated:

initial = lambda: np.random.multivariate_normal(np.zeros(d) + 3, 10 * np.eye(d))
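A minimal check (again illustrative only) that the sampler returns one d-dimensional draw per call, here centered at 3·1 with covariance 10·I:

```python
import numpy as np

d = 10
# Gaussian initial distribution: mean 3*ones(d), covariance 10*I.
initial = lambda: np.random.multivariate_normal(np.zeros(d) + 3, 10 * np.eye(d))

x0 = initial()
print(x0.shape)  # (10,)
```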

3. Running the Algorithm

To execute the global optimization algorithm, use the HRLA.Algorithm class.

import GlobalOptimizationHRLA as HRLA

algorithm = HRLA.Algorithm(d=d, M=100, N=10, K=14000, h=0.01, title="rastrigin", U=U, dU=dU, initial=initial)
samples_filename = algorithm.generate_samples(As=[1,2,3,4], sim_annealing=False)

Parameters:

  • d (int): Dimension of the search space.
  • M (int): Number of particles in the swarm.
  • N (int): Number of generations for resampling.
  • K (int): Total number of iterations to perform.
  • h (float): Step size of the Langevin discretization.
  • title (str): Title for the optimization, useful for organizing saved data.
  • U (function): The target function to optimize.
  • dU (function): The gradient of the target function.
  • initial (function): The initial distribution for generating particles.
  • As (list): List of tolerances or annealing factors to adjust optimization.
  • sim_annealing (bool): Determines whether to apply simulated annealing (default is False).

Returns:

  • samples_filename (str): Path to the file where generated samples are saved.
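For intuition, a plain (unadjusted) Langevin iteration on U can be sketched as below. This is only a first-order sketch under stated assumptions, not the package's high-resolution scheme or API; the helper name langevin_step and the factor a (playing the role of the tolerances/annealing factors As) are hypothetical:

```python
import numpy as np

d = 10
U = lambda x: d + np.linalg.norm(x) ** 2 - np.sum(np.cos(2 * np.pi * x))
dU = lambda x: 2 * x + 2 * np.pi * np.sin(2 * np.pi * x)

rng = np.random.default_rng(0)
h = 0.01  # step size, as passed to HRLA.Algorithm

def langevin_step(x, a=1.0):
    # One unadjusted Langevin update: gradient step plus Gaussian noise.
    # 'a' acts as an inverse-temperature-like factor (cf. the As list).
    return x - h * dU(x) + np.sqrt(2 * h / a) * rng.standard_normal(d)

# Start from the initial distribution's mean and iterate.
x = np.full(d, 3.0)
for _ in range(1000):
    x = langevin_step(x, a=4.0)
print(U(x))  # typically far below the starting value U(3*ones) = 90
```

The high-resolution discretization implemented by the package refines this basic update to track the continuous Langevin dynamics more accurately at the same step size.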

4. Post-processing and Plotting Results

After running the optimization, use the PostProcessor object to analyze and plot the empirical probabilities from the generated samples.

postprocessor = HRLA.PostProcessor(samples_filename)

Parameters:

  • samples_filename (str): The filename containing the generated samples data.

The PostProcessor object provides multiple methods. One is the plot_empirical_probabilities method, which generates a plot of the empirical probabilities for different tolerances.

postprocessor.plot_empirical_probabilities(dpi=10, layout="32", tols=[1,2,3,4,5,6], running=False)

Parameters:

  • dpi (int): Resolution of the plot, in dots per inch.
  • layout (str): Layout of the plot, specified as a string. Must be one of ["13", "23", "32", "22"] (default is "23").
  • tols (list): List of tolerances for computing empirical probabilities (default is [1,2,3,4,5,6]).
  • running (bool): Whether to display the plot with a running average or not (default is False).

Another method is compute_tables, which generates tables of empirical means and standard deviations.

postprocessor.compute_tables(measured=[K], dpi=100, mode="mean", running=True)

Parameters:

  • measured (list): List with iteration counts to measure the empirical probabilities.
  • dpi (int): Resolution of the plot, in dots per inch.
  • mode (str): Mode for computing the tables, specified as a string. Must be one of ["mean", "std", "best"] (default is "mean").
  • running (bool): Whether the results are computed with a running average (default is True).

Examples

Examples may be found in the /examples directory of the repository.
