A package to solve global optimization problems in small dimensions. The method is based on sampling algorithms, and specifically implements the high-resolution Langevin algorithm.

Project description

Global Optimization through High-Resolution Sampling

This package provides functions to run a global optimization algorithm that explores the landscape of a target function through high-resolution sampling. It is based on the following paper. The package includes tools for defining target functions, setting optimization parameters, generating samples, and visualizing empirical probabilities.

Installation

The package is available through pip, and may be installed via:

pip install GlobalOptimizationHRLA

Setup

In order to use this package, you need to define:

  • The target function and its gradient.
  • An initial distribution for the search space.

Main Usage

1. Defining the Target Function and Gradient

The provided example uses the Rastrigin function as the target for optimization.

import numpy as np

# Rastrigin-type target in d dimensions; its global minimum is 0 at the origin
d = 10
U = lambda x: d + np.linalg.norm(x) ** 2 - np.sum(np.cos(2 * np.pi * x))
dU = lambda x: 2 * x + 2 * np.pi * np.sin(2 * np.pi * x)  # gradient of U
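
A quick finite-difference check (plain NumPy, not part of the package API) can confirm that dU matches U before committing to a long optimization run:

```python
import numpy as np

d = 10
U = lambda x: d + np.linalg.norm(x) ** 2 - np.sum(np.cos(2 * np.pi * x))
dU = lambda x: 2 * x + 2 * np.pi * np.sin(2 * np.pi * x)

# Central finite differences, one coordinate at a time
rng = np.random.default_rng(0)
x = rng.standard_normal(d)
eps = 1e-5
num = np.array([(U(x + eps * e) - U(x - eps * e)) / (2 * eps) for e in np.eye(d)])

assert np.allclose(num, dU(x), atol=1e-4)
```

The same check works for any custom target you plug into the algorithm.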

2. Sampling from an Initial Distribution

Define an initial distribution from which samples are generated:

# Gaussian initial distribution: mean (3, ..., 3), covariance 10 * I
initial = lambda: np.random.multivariate_normal(np.zeros(d) + 3, 10 * np.eye(d))
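
A short sanity check (plain NumPy, independent of the package) verifies that draws from this distribution have the expected shape and spread:

```python
import numpy as np

d = 10
initial = lambda: np.random.multivariate_normal(np.zeros(d) + 3, 10 * np.eye(d))

# Draw a batch of initial particles and inspect their statistics
draws = np.array([initial() for _ in range(1000)])
print(draws.shape)   # one particle per row, d coordinates per particle
print(draws.mean())  # should be near 3, the mean of the distribution
```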

3. Running the Algorithm

To execute the global optimization algorithm, use the HRLA.Algorithm class.

import GlobalOptimizationHRLA as HRLA

algorithm = HRLA.Algorithm(d=d, M=100, N=10, K=14000, h=0.01, title="rastrigin", U=U, dU=dU, initial=initial)
samples_filename = algorithm.generate_samples(As=[1,2,3,4], sim_annealing=False)

Parameters:

  • d (int): Dimension of the search space.
  • M (int): Number of particles in the swarm.
  • N (int): Number of generations for resampling.
  • K (int): Total number of iterations to perform.
  • h (float): Step size for the Langevin updates.
  • title (str): Title for the optimization, useful for organizing saved data.
  • U (function): The target function to optimize.
  • dU (function): The gradient of the target function.
  • initial (function): The initial distribution for generating particles.
  • As (list): List of tolerances or annealing factors to adjust optimization.
  • sim_annealing (bool): Determines whether to apply simulated annealing (default is False).

Returns:

  • samples_filename (str): Path to the file where generated samples are saved.

4. Post-processing and Plotting Results

After running the optimization, use the PostProcessor object to analyze and plot the empirical probabilities from the generated samples.

postprocessor = HRLA.PostProcessor(samples_filename)

Parameters:

  • samples_filename (str): The filename containing the generated samples data.

The PostProcessor object provides multiple methods. One is the plot_empirical_probabilities method, which generates a plot of the empirical probabilities for different tolerances.

postprocessor.plot_empirical_probabilities(dpi=10, layout="32", tols=[1,2,3,4,5,6], running=False)

Parameters:

  • dpi (int): Resolution of the plot, in dots per inch.
  • layout (str): Layout of the plot, specified as a string. Must be one of ["13", "23", "32", "22"] (default is "23").
  • tols (list): List of tolerances for computing empirical probabilities (default is [1,2,3,4,5,6]).
  • running (bool): Whether to display the plot with a running average or not (default is False).
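
To illustrate what such a plot reports, here is a hypothetical computation (plain NumPy, not the package's internal code) of one natural notion of empirical probability: the fraction of particles whose target value lies within a tolerance of the known minimum (0 for the Rastrigin variant above). The package's exact definition may differ.

```python
import numpy as np

d = 10
U = lambda x: d + np.linalg.norm(x) ** 2 - np.sum(np.cos(2 * np.pi * x))

# Stand-in for sampler output: particles concentrated near the minimizer
rng = np.random.default_rng(1)
particles = rng.normal(0.0, 0.1, size=(100, d))
values = np.array([U(x) for x in particles])

# Empirical probability that a particle's value is within each tolerance
probs = {tol: float(np.mean(values <= tol)) for tol in [1, 2, 3, 4, 5, 6]}
print(probs)
```

Looser tolerances admit more particles, so these probabilities are nondecreasing in the tolerance.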

Another method is compute_tables, which generates tables of empirical means and standard deviations.

postprocessor.compute_tables(measured=[14000], dpi=100, mode="mean", running=True)

Parameters:

  • measured (list): List of iteration counts at which to measure the empirical probabilities.
  • dpi (int): Resolution of the plot, in dots per inch.
  • mode (str): Mode for computing the tables, specified as a string. Must be one of ["mean", "std", "best"] (default is "mean").
  • running (bool): Whether the results are computed with a running average (default is True).

Examples

Examples may be found in the /examples directory of the repository.
