
pypop7 (Pure-PYthon library of POPulation-based black-box OPtimization)

Project description

PyPop7 is a Pure-PYthon library of POPulation-based OPtimization for single-objective, real-parameter, black-box problems (currently actively developed). Its main goal is to provide a unified interface and elegant implementations for Black-Box Optimization (BBO), particularly population-based optimizers, in order to facilitate research repeatability and also real-world applications.


More specifically, to alleviate the notorious curse of dimensionality in BBO (nearly all BBO methods rely on iterative sampling), the primary focus of PyPop7 is to cover State-Of-The-Art (SOTA) implementations for Large-Scale Optimization (LSO), though many other versions and variants are also included here (for benchmarking/mixing purposes, and sometimes even for practical use).

How to Use PyPop7

The following three simple steps are enough to utilize the optimization power of PyPop7:

  1. Use pip to install pypop7 in a Python3-based virtual environment created via venv or conda:
$ pip install pypop7

For simplicity, all required dependencies are installed automatically according to setup.cfg.

  2. Define your own objective function for the optimization problem at hand,

  3. Run one or more black-box optimizers from pypop7 on the given optimization problem:

import numpy as np  # for numerical computation, which is also the computing engine of pypop7

# 2. Define your own objective function for the optimization problem at hand:
#   the below example is Rosenbrock, the notorious test function in the optimization community
def rosenbrock(x):
    return 100 * np.sum(np.power(x[1:] - np.power(x[:-1], 2), 2)) + np.sum(np.power(x[:-1] - 1, 2))

# define the fitness (cost) function and also its settings
ndim_problem = 1000
problem = {'fitness_function': rosenbrock,  # cost function
           'ndim_problem': ndim_problem,  # dimension
           'lower_boundary': -5 * np.ones((ndim_problem,)),  # search boundary
           'upper_boundary': 5 * np.ones((ndim_problem,))}

# 3. Run one or more black-box optimizers from pypop7 on the given optimization problem:
#   here we choose LM-MA-ES owing to its low complexity and metric-learning ability for LSO
from pypop7.optimizers.es.lmmaes import LMMAES
# define all the necessary algorithm options (which differ among different optimizers)
options = {'fitness_threshold': 1e-10,  # terminate when the best-so-far fitness is lower than this threshold
           'max_runtime': 3600,  # 1 hour (terminate when the actual runtime exceeds it)
           'seed_rng': 0,  # seed of random number generation (which must be explicitly set for repeatability)
           'x': 4 * np.ones((ndim_problem,)),  # initial mean of search (mutation/sampling) distribution
           'sigma': 0.3,  # initial global step-size of search distribution
           'verbose_frequency': 500}
lmmaes = LMMAES(problem, options)  # initialize the optimizer
results = lmmaes.optimize()  # run its (time-consuming) search process
print(results)
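Before launching a long optimization run, it can help to sanity-check the objective function at a point whose value is known. Below is a standalone sketch (redefining the same Rosenbrock function as above) that checks its global minimum, which lies at x = (1, ..., 1) with value 0:

```python
import numpy as np

def rosenbrock(x):
    return 100 * np.sum(np.power(x[1:] - np.power(x[:-1], 2), 2)) + np.sum(np.power(x[:-1] - 1, 2))

# the global minimum of Rosenbrock is at x = (1, ..., 1), where the function value is 0
x_star = np.ones((1000,))
print(rosenbrock(x_star))  # -> 0.0
```

If this check fails for your own objective function, fix the function before interpreting any optimizer's output.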

The DEMOs below are given on a toy 2-dimensional minimization function, in order to visually show the very interesting/powerful evolutionary search processes of four optimizers:

MA-ES | LM-CMA-ES
Hooke-Jeeves (1961) | Nelder-Mead (1965)

(demo animations omitted)

A (Still Growing) List of Publicly Available Gradient-Free Optimizers (GFO)


  • large-scale-optimization: indicates the specific version for LSO (e.g., dimension >= 1000).

  • competitor: indicates the competitive (or de facto) version for relatively low-dimensional problems (though it may also work well in certain LSO settings).

  • baseline: indicates the baseline version, for benchmarking purposes or theoretical interest.


Design Philosophy

  • Respect for Beauty (Elegance)

    • From the problem-solving perspective, we empirically prefer to choose the best optimizer for the black-box optimization problem at hand. For a new problem, however, the best optimizer is often unknown in advance (without prior knowledge). As a rule of thumb, we compare a (usually small) set of available/well-known optimizers and choose the best of them according to some predefined performance criteria. From the research perspective, we like beautiful optimizers, while always keeping the “No Free Lunch” theorem in mind. Typically, the beauty of an optimizer comes from the following features: novelty (e.g., GA/PSO), competitive performance on at least one class of problems (e.g., BO), theoretical insights (e.g., CMA-ES/NES), clarity/simplicity (e.g., CEM/EDA), and repeatability.

      • "If there is a single dominant theme in this ..., it is that practical methods of numerical computation can be simultaneously efficient, clever, and — important — clear." (From Press, W.H., Teukolsky, S.A., Vetterling, W.T. and Flannery, B.P., 2007. Numerical recipes: The art of scientific computing. Cambridge University Press.)
    • If you find any BBO/DFO method that meets the above standard, you are welcome to open issues or pull requests; we will consider including it in the pypop7 library. Note that any superficial imitation of the above well-established optimizers ('Old Wine in a New Bottle') will NOT be considered.

  • Respect for Diversity

    • Given the universality of black-box optimization (BBO) in science and engineering, different research communities have designed different methods, and their number continues to grow. On the one hand, some of these methods share more or less similarities; on the other hand, they may also show significant differences (w.r.t. motivations / objectives / implementations / practitioners). We therefore hope to cover this diversity across research communities such as artificial intelligence (particularly machine learning, evolutionary computation, and zeroth-order optimization), mathematical optimization/programming (particularly global optimization), operations research / management science, automatic control, open-source software, and perhaps others.
  • Respect for Originality

    • “It is both enjoyable and educational to hear the ideas directly from the creators”. (From Hennessy, J.L. and Patterson, D.A., 2019. Computer architecture: A quantitative approach (Sixth Edition). Elsevier.)

    • For each optimizer considered here, we expect to give its original/representative references (including its good implementations/improvements). If you find an important reference missing, please do NOT hesitate to contact us (we will be happy to add it if necessary).

  • Respect for Repeatability

    • For randomized search, properly controlling randomness is crucial to repeating numerical experiments. Here we follow the Random Sampling suggestions from NumPy. In other words, you must explicitly set the random seed for each optimizer.
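The NumPy convention referred to above can be illustrated with a minimal standalone sketch (plain NumPy, not pypop7 internals): two generators constructed with the same seed produce identical sample streams, which is exactly what makes a seeded run repeatable.

```python
import numpy as np

# two generators seeded identically produce identical sample streams,
# so any computation driven by them is bit-for-bit repeatable
rng1 = np.random.default_rng(0)
rng2 = np.random.default_rng(0)
samples1 = rng1.standard_normal(5)
samples2 = rng2.standard_normal(5)
print(np.array_equal(samples1, samples2))  # -> True
```

This is why `seed_rng` must be set explicitly in the options dict: it pins down the entire random stream of the optimizer run.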

Computational Efficiency

For LSO, computational efficiency is an indispensable performance criterion of DFO in the post-Moore era. To obtain the highest computational performance possible, NumPy is heavily used in this library as the basis of numerical computation, along with SciPy. Sometimes Numba is also utilized, in order to further reduce wall-clock time.
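The main source of this speed is vectorization: expressing per-dimension work as whole-array NumPy operations instead of Python-level loops. A minimal illustration (not pypop7 source code) on the Rosenbrock function, where both forms compute the same value but the vectorized one avoids the interpreter loop:

```python
import numpy as np

def rosenbrock_loop(x):
    # straightforward but slow: one Python-level iteration per dimension
    total = 0.0
    for i in range(len(x) - 1):
        total += 100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1.0) ** 2
    return total

def rosenbrock_vectorized(x):
    # same computation expressed as whole-array NumPy operations
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (x[:-1] - 1.0) ** 2)

x = np.random.default_rng(0).standard_normal(1000)
print(np.isclose(rosenbrock_loop(x), rosenbrock_vectorized(x)))  # -> True
```

For 1000-dimensional inputs evaluated thousands of times per run, this difference dominates the wall-clock time of the whole search.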

Development Guide

PEP 257 – Docstring Conventions

Since this library is built on the wonderful numerical computing library NumPy, we further use the Docstring Conventions from NumPy: numpydoc.
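For contributors, a minimal sketch of what a numpydoc-style docstring looks like (the `sphere` function below is a hypothetical example, not a pypop7 API):

```python
import numpy as np

def sphere(x):
    """Compute the sphere test function (sum of squares).

    Parameters
    ----------
    x : ndarray
        Input vector of shape `(n,)`.

    Returns
    -------
    float
        The function value ``sum(x**2)``.
    """
    return float(np.sum(np.square(x)))
```

The `Parameters` and `Returns` sections with underlined-style headers are the core of the numpydoc convention; Sphinx renders them into the online documentation.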


Research Support

This open-source Python library for black-box optimization is now supported by Shenzhen Fundamental Research Program under Grant No. JCYJ20200109141235597 (¥2,000,000 from 2021 to 2023), granted to Prof. Yuhui Shi (CSE, SUSTech @ Shenzhen, China), and actively developed by three of his group members (e.g., Qiqi Duan, Chang Shao, Guochen Zhou).

Now Zhuowei Wang from the University of Technology Sydney (UTS) takes part in this library as a core developer (for testing). Mingyang Feng from the University of Birmingham helps search for papers involved in this library.

Project details

Latest source distribution: pypop7-0.0.24.tar.gz (1.9 MB)