A package to solve global optimization problems in small dimensions. The method is based on sampling algorithms, and specifically implements the high-resolution Langevin algorithm.

Project description

Global Optimization through High-Resolution Sampling

This package provides functions to run a global optimization algorithm that explores the target landscape through High-Resolution Langevin sampling. It is based on the following paper. The package includes tools for defining target functions and their gradients, setting optimization parameters, generating samples, and visualizing empirical probabilities.

Installation

The package is available through pip, and may be installed via:

pip install GlobalOptimizationHRLA

Setup

In order to use this package, you need to define:

  • The target function and its gradient.
  • An initial distribution for the search space.

Main Usage

1. Defining the Target Function and Gradient

The provided example uses a Rastrigin-type function as the optimization target.

import numpy as np

d = 10

# Rastrigin-type target: U(x) = d + ||x||^2 - sum_i cos(2*pi*x_i),
# with global minimum U(0) = 0 at the origin.
U = lambda x: d + np.linalg.norm(x) ** 2 - np.sum(np.cos(2 * np.pi * x))

# Analytic gradient of U.
dU = lambda x: 2 * x + 2 * np.pi * np.sin(2 * np.pi * x)
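As a quick sanity check (not part of the package), the analytic gradient can be compared against central finite differences, and the minimum value verified at the origin:

```python
import numpy as np

d = 10
U = lambda x: d + np.linalg.norm(x) ** 2 - np.sum(np.cos(2 * np.pi * x))
dU = lambda x: 2 * x + 2 * np.pi * np.sin(2 * np.pi * x)

# The global minimum sits at the origin, where U vanishes.
assert np.isclose(U(np.zeros(d)), 0.0)

# Central finite differences should match the analytic gradient.
rng = np.random.default_rng(0)
x = rng.standard_normal(d)
eps = 1e-6
num_grad = np.array([
    (U(x + eps * np.eye(d)[i]) - U(x - eps * np.eye(d)[i])) / (2 * eps)
    for i in range(d)
])
assert np.allclose(num_grad, dU(x), atol=1e-4)
```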

2. Sampling from an Initial Distribution

Define an initial distribution from which samples are generated:

initial = lambda: np.random.multivariate_normal(np.zeros(d) + 3, 10 * np.eye(d))
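A quick check (using nothing beyond NumPy) confirms that each call to this sampler returns one d-dimensional particle centered at (3, ..., 3):

```python
import numpy as np

d = 10
initial = lambda: np.random.multivariate_normal(np.zeros(d) + 3, 10 * np.eye(d))

# Each call draws a single d-dimensional particle from N(3*1, 10*I).
draws = np.array([initial() for _ in range(2000)])
assert draws.shape == (2000, d)

# The empirical per-coordinate mean should be close to 3.
assert np.allclose(draws.mean(axis=0), 3.0, atol=0.5)
```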

3. Running the Algorithm

To execute the global optimization algorithm, use the HRLA.Algorithm class.

import GlobalOptimizationHRLA as HRLA

title = "rastrigin"  # label used when saving generated samples
algorithm = HRLA.Algorithm(d=d, M=100, N=10, K=14000, h=0.01, title=title, U=U, dU=dU, initial=initial)
samples_filename = algorithm.generate_samples(As=[1,2,3,4], sim_annealing=False)

Parameters:

  • d (int): Dimension of the search space.
  • M (int): Number of particles in the swarm.
  • N (int): Number of generations for resampling.
  • K (int): Total number of iterations to perform.
  • h (float): Step size for the Langevin updates.
  • title (str): Title for the optimization, useful for organizing saved data.
  • U (function): The target function to optimize.
  • dU (function): The gradient of the target function.
  • initial (function): The initial distribution for generating particles.
  • As (list): List of tolerances or annealing factors to adjust optimization.
  • sim_annealing (bool): Determines whether to apply simulated annealing (default is False).

Returns:

  • samples_filename (str): Path to the file where generated samples are saved.

4. Post-processing and Plotting Results

After running the optimization, use the PostProcessor object to analyze and plot the empirical probabilities from the generated samples.

postprocessor = HRLA.PostProcessor(samples_filename)

Parameters:

  • samples_filename (str): The filename containing the generated samples data.

The PostProcessor object provides multiple methods. One is the plot_empirical_probabilities method, which generates a plot of the empirical probabilities for different tolerances.

postprocessor.plot_empirical_probabilities(dpi=10, layout="32", tols=[1,2,3,4,5,6], running=False)

Parameters:

  • dpi (int): Resolution of the plot, in dots per inch.
  • layout (str): Layout of the plot, specified as a string. Must be one of ["13", "23", "32", "22"] (default is "23").
  • tols (list): List of tolerances for computing empirical probabilities (default is [1,2,3,4,5,6]).
  • running (bool): Whether to plot a running average of the empirical probabilities (default is False).
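For intuition, an empirical probability for a tolerance tol can be read as the fraction of sampled particles whose target value falls below tol. A minimal sketch with synthetic data (the variable names and semantics here are illustrative assumptions, not the package's internals):

```python
import numpy as np

# Synthetic stand-in for the target values U(x) at sampled particles.
rng = np.random.default_rng(1)
values = rng.exponential(scale=2.0, size=1000)

def empirical_probability(values, tol):
    """Fraction of samples whose target value lies below tol."""
    return float(np.mean(values < tol))

probs = [empirical_probability(values, t) for t in [1, 2, 3, 4, 5, 6]]

# Probabilities are nondecreasing in the tolerance and bounded by 1.
assert all(a <= b for a, b in zip(probs, probs[1:]))
```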

Another method is compute_tables, which generates tables of empirical means and standard deviations.

postprocessor.compute_tables(measured=[K], dpi=100, mode="mean", running=True)

Parameters:

  • measured (list): List with iteration counts to measure the empirical probabilities.
  • dpi (int): Resolution of the plot, in dots per inch.
  • mode (str): Mode for computing the tables, specified as a string. Must be one of ["mean", "std", "best"] (default is "mean").
  • running (bool): Whether the results are computed with a running average (default is True).

Examples

Examples may be found in the /examples directory of the repository.

Project details


Download files

Download the file for your platform.

Source Distribution

globaloptimizationhrla-1.1.0.tar.gz (10.8 kB view details)

Uploaded Source

Built Distribution


GlobalOptimizationHRLA-1.1.0-py3-none-any.whl (13.0 kB view details)

Uploaded Python 3

File details

Details for the file globaloptimizationhrla-1.1.0.tar.gz.

File metadata

  • Download URL: globaloptimizationhrla-1.1.0.tar.gz
  • Upload date:
  • Size: 10.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.5

File hashes

Hashes for globaloptimizationhrla-1.1.0.tar.gz

  • SHA256: cf59ddb21803167c7667193d408d1755703bf8056a27d6a25151afe43aa11ae6
  • MD5: cbe750561618c5919092bd5101f6a98d
  • BLAKE2b-256: 6f4a3cb9e7c4cebb6a24226739cff02e352daab5c944b2217188f284ab9e5712


File details

Details for the file GlobalOptimizationHRLA-1.1.0-py3-none-any.whl.

File metadata

File hashes

Hashes for GlobalOptimizationHRLA-1.1.0-py3-none-any.whl

  • SHA256: f970e7892d6ffb3b80f9b759316c454fa475037a266068d0bcc8a6b67d608c71
  • MD5: 55258b9244ae8f567d5e0dec212e6268
  • BLAKE2b-256: bc874d983507094bc10df242b5b989ffb7cee50bda5af0162fcbfbb226468b53

