
HGDL


HGDL is an API for HPC-distributed function optimization. At its core, the algorithm combines local and global optimization with bump-function-based deflation to produce a growing list of unique optima of a differentiable function. This addresses the common problem of non-uniqueness in optimization, especially in machine learning.
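The idea behind deflation can be sketched in a few lines: the gradient is divided by terms of the form (1 - b(x)), where b is a bump function centered at each previously found optimum, so that Newton-type iterates are repelled from known solutions. The bump shape and the `deflated_gradient` helper below are illustrative choices, not HGDL's exact implementation:

```python
import numpy as np

def bump(x, x0, r):
    # smooth bump: 1 at x0, exactly 0 outside the ball ||x - x0|| >= r
    d2 = np.sum((x - x0) ** 2)
    if d2 >= r * r:
        return 0.0
    return np.exp(1.0 - 1.0 / (1.0 - d2 / (r * r)))

def deflated_gradient(grad, x, found_optima, r):
    # scale the gradient by 1/(1 - b) for every previously found optimum;
    # near a known optimum the factor blows up, so a local optimizer
    # cannot re-converge there (the tiny epsilon only guards exact 0-division)
    g = grad(x)
    for x0 in found_optima:
        g = g / (1.0 - bump(x, x0, r) + 1e-12)
    return g
```

Far from all known optima the bump terms vanish and the deflated gradient equals the plain gradient, so the modification is invisible except in the neighborhoods being excluded.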

Usage

The following demonstrates simple usage of the HGDL API:

import numpy as np
import dask.distributed as distributed
from hgdl.hgdl import HGDL as hgdl
# schwefel, schwefel_gradient, and schwefel_hessian are test functions
# provided alongside the examples
from support_functions import *

bounds = np.array([[-500, 500], [-500, 500]])
#dask_client = distributed.Client("10.0.0.184:8786")  # optional: attach to a dask cluster
a = hgdl(schwefel, schwefel_gradient, bounds,
         hess = schwefel_hessian,
         #global_optimizer = "random",
         global_optimizer = "genetic",
         #global_optimizer = "gauss",
         local_optimizer = "dNewton",
         number_of_optima = 30000, info = True,
         radius = None, num_epochs = 100)
# extra positional arguments for the objective can be passed via args=(...);
# the Schwefel test functions take none

x0 = np.random.uniform(low = bounds[:, 0], high = bounds[:, 1], size = (20, 2))
a.optimize(x0 = x0)
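The example above imports the Schwefel test function and its derivatives from a local `support_functions` module. For reference, a self-contained sketch of those helpers might look like the following; the exact signatures shipped with HGDL's examples may differ:

```python
import numpy as np

def schwefel(x, *args):
    # Schwefel test function; global minimum ~0 at x* = (420.9687, ...)
    return 418.9829 * len(x) - np.sum(x * np.sin(np.sqrt(np.abs(x))))

def schwefel_gradient(x, *args):
    # analytic gradient; the same formula holds for negative components
    # because |x| and sqrt(|x|) are even in x
    s = np.sqrt(np.abs(x))
    return -np.sin(s) - 0.5 * s * np.cos(s)
```

The many local minima of this function are what make it a good stress test for an optimizer that tries to enumerate unique optima rather than return a single one.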

Credits

Main developers: Marcus Noack (MarcusNoack@lbl.gov) and David Perryman. Several people from across the DOE national labs have contributed insights that shaped the code into its current form; see AUTHORS for details. HGDL is based on the HGDN algorithm by Noack and Funke.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

hgdl-2.0.0.tar.gz (34.6 kB)

Built Distribution

hgdl-2.0.0-py3-none-any.whl (19.3 kB)
