
HMS (Hierarchic Memetic Strategy) is a composite global optimization strategy that combines a multi-population evolutionary strategy with auxiliary methods. HMS organizes its component populations in a tree with a fixed maximal height and variable internal node degree, and each component population is governed by a particular evolutionary engine. This package provides a simple Python implementation with examples of using different population engines.

Project description

pyhms


pyhms is a Python implementation of Hierarchic Memetic Strategy (HMS).

The Hierarchic Memetic Strategy is a stochastic global optimizer designed to tackle highly multimodal problems. It is a composite global optimization strategy consisting of a multi-population evolutionary strategy and auxiliary methods. HMS maintains a dynamically evolving data structure that organizes the component populations: a tree with a fixed maximal height and variable internal node degree. Each component population is governed by a particular optimization engine. This package provides a simple Python implementation.
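The following is a highly simplified, self-contained sketch of that idea, not of pyhms internals or its API: a root population explores the search space globally and sprouts small child populations that refine promising regions with a smaller mutation step. All names and parameter values below are illustrative only.

import numpy as np

rng = np.random.default_rng(0)
f = lambda x: np.sum(x**2, axis=1)  # toy objective, evaluated row-wise
bounds = np.array([(-20.0, 20.0), (-20.0, 20.0)])

def evolve(pop, sigma, steps=5):
    # A few (mu + lambda)-style steps with Gaussian mutation; keep the best individuals.
    for _ in range(steps):
        children = np.clip(pop + rng.normal(0.0, sigma, pop.shape), bounds[:, 0], bounds[:, 1])
        merged = np.vstack([pop, children])
        pop = merged[np.argsort(f(merged))[: len(pop)]]
    return pop

root = rng.uniform(bounds[:, 0], bounds[:, 1], size=(20, 2))
children = []
for metaepoch in range(10):
    root = evolve(root, sigma=1.0)  # global exploration at the root
    if len(children) < 4:           # crude stand-in for a sprout condition
        seed = np.clip(root[0] + rng.normal(0.0, 1.0, size=(10, 2)), bounds[:, 0], bounds[:, 1])
        children.append(seed)
    children = [evolve(c, sigma=0.25) for c in children]  # local refinement below the root

best = min([root] + children, key=lambda p: f(p).min())
print("best value:", f(best).min())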

Installation

pyhms is available on PyPI and can be installed with pip:

pip install pyhms

It's also possible to install the current main branch:

pip install git+https://github.com/agh-a2s/pyhms.git@main
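To check that the installation works, import the package from the command line. The __version__ attribute is an assumption and may not be exposed by every release; a plain import pyhms is enough otherwise:

# prints the installed version, assuming pyhms exposes __version__
python -c "import pyhms; print(pyhms.__version__)"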

Quick Start

from pyhms import minimize
import numpy as np

# Objective: 2D sphere function (minimum 0 at the origin).
fun = lambda x: sum(x**2)
# Box constraints, one (lower, upper) pair per dimension.
bounds = np.array([(-20, 20), (-20, 20)])
solution = minimize(
    fun=fun,
    bounds=bounds,
    maxfun=10000,  # budget of objective function evaluations
    log_level="debug",
    seed=42,
)

pyhms provides an interface similar to scipy.optimize.minimize. This is the simplest way to run HMS with default parameters.
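Because HMS targets highly multimodal problems, the same interface can be used with harder objectives. The sketch below uses the Rastrigin benchmark as the objective; the function itself is not part of pyhms, and only parameters already shown above (fun, bounds, maxfun, seed) are passed.

import numpy as np
from pyhms import minimize

# Rastrigin: a standard highly multimodal benchmark with its global minimum of 0 at the origin.
def rastrigin(x: np.ndarray) -> float:
    return 10 * len(x) + float(np.sum(x**2 - 10 * np.cos(2 * np.pi * x)))

rastrigin_bounds = np.array([(-5.12, 5.12), (-5.12, 5.12)])
solution = minimize(
    fun=rastrigin,
    bounds=rastrigin_bounds,
    maxfun=10000,
    seed=42,
)

For finer control, the HMS tree can also be configured explicitly. The example below builds a two-level tree: each level is described by an EALevelConfig, the whole run stops after a fixed number of metaepochs, and new populations are sprouted according to an NBC-based condition.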

import numpy as np
from pyhms import (
    EALevelConfig,
    hms,
    get_NBC_sprout,
    DontStop,
    MetaepochLimit,
    SEA,
    Problem,
)

square_bounds = np.array([(-20, 20), (-20, 20)])
square_problem = Problem(lambda x: sum(x**2), maximize=False, bounds=square_bounds)

config = [
    # Root level: broad exploration with a larger population and mutation step.
    EALevelConfig(
        ea_class=SEA,
        generations=2,
        problem=square_problem,
        pop_size=20,
        mutation_std=1.0,
        lsc=DontStop(),
    ),
    # Lower level: local refinement with a smaller population and mutation step.
    EALevelConfig(
        ea_class=SEA,
        generations=4,
        problem=square_problem,
        pop_size=10,
        mutation_std=0.25,
        sample_std_dev=1.0,
        lsc=DontStop(),
    ),
]
# Stop the whole run after 10 metaepochs.
global_stop_condition = MetaepochLimit(limit=10)
# Sprout new populations using the NBC-based condition, with at most 4 tree levels.
sprout_condition = get_NBC_sprout(level_limit=4)
hms_tree = hms(config, global_stop_condition, sprout_condition)
print(hms_tree.summary())

Relevant literature

  • J. Sawicki, M. Łoś, M. Smołka, R. Schaefer. Understanding measure-driven algorithms solving irreversibly ill-conditioned problems. Natural Computing 21:289-315, 2022. doi: 10.1007/s11047-020-09836-w
  • J. Sawicki, M. Łoś, M. Smołka, J. Alvarez-Aramberri. Using Covariance Matrix Adaptation Evolutionary Strategy to boost the search accuracy in hierarchic memetic computations. Journal of Computational Science 34:48-54, 2019. doi: 10.1016/j.jocs.2019.04.005


