
pypop7 (A Pure-PYthon library of POPulation-based OPtimization)

Project description


License: GNU General Public License v3.0. Community chat: Gitter (pypop).

PyPop is a Pure-PYthon library of POPulation-based OPtimization for single-objective, real-parameter, black-box problems (currently under active development). Its goal is to provide a unified interface and elegant implementations for Derivative-Free Optimization (DFO), particularly population-based optimizers, in order to facilitate research repeatability and real-world applications.


To alleviate the notorious curse of dimensionality in DFO (which relies on iterative sampling), the main focus of PyPop is to cover state-of-the-art implementations for Large-Scale Optimization (LSO), though other versions and variants may also be included (e.g., for benchmarking purposes, for mixing purposes, and sometimes even for practical purposes).
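To make "DFO based on iterative sampling" concrete, here is a minimal sketch of a (1+1)-style random search in NumPy. This is an illustrative toy, not PyPop's actual implementation; the function name `random_search` and all parameters are assumptions for the example.

```python
import numpy as np

def random_search(fitness, ndim, max_evals=1000, sigma=0.1, seed=0):
    """A toy (1+1)-style random search: iteratively sample a Gaussian
    perturbation of the best-so-far solution and keep it only if it
    improves the fitness (minimization)."""
    rng = np.random.default_rng(seed)
    best_x = rng.uniform(-5.0, 5.0, size=ndim)  # random initial solution
    best_y = fitness(best_x)
    for _ in range(max_evals - 1):
        x = best_x + sigma * rng.standard_normal(ndim)  # sample nearby
        y = fitness(x)
        if y < best_y:  # greedy acceptance
            best_x, best_y = x, y
    return best_x, best_y

sphere = lambda x: float(np.sum(x ** 2))  # a classic test function
x, y = random_search(sphere, ndim=2, max_evals=2000)
```

Because each iteration samples around the incumbent, the cost of locating good regions grows quickly with `ndim` — exactly the curse of dimensionality that LSO-specialized variants try to mitigate.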

A (Still Growing) List of Publicly Available Gradient-Free Optimizers (GFO)


large-scale-optimization: indicates the version specialized for LSO (e.g., dimension >= 1000).

competitor: indicates the competitive (or de facto) version for relatively low-dimensional problems (though it may also work well in certain LSO settings).

baseline: indicates the baseline version, kept for benchmarking purposes or theoretical interest.


Design Philosophy

  • Respect for Beauty (Elegance)

    • From the problem-solving perspective, we prefer to choose the (empirically) best optimizer for the given black-box problem. For a new problem, however, the (empirically) best optimizer is often unknown in advance (without prior knowledge). As a rule of thumb, we need to compare a (often small) set of all available/known optimizers and choose the best one according to some performance criteria. From the research perspective, however, we prefer beautiful optimizers, while always keeping the “No Free Lunch” theorem in mind. Typically, the beauty of an optimizer comes from the following features: novelty (e.g., GA and PSO), competitive performance (on at least one class of problems), theoretical insights (e.g., NES/CMA-ES), clarity/simplicity (ease of understanding and implementation), and so on.

    • If you find a DFO that meets the above standard, feel free to open an issue or pull request, and we will consider including it in the pypop library. Note that any superficial imitation of the above well-established optimizers ('Old Wine in a New Bottle') will NOT be considered.

  • Respect for Diversity

    • Given the universality of black-box optimization (BBO) in science and engineering, different research communities have designed different methods, and the number of such methods continues to grow. On the one hand, some of these methods may share more or less similarities. On the other hand, they may also show significant differences (w.r.t. motivations / objectives / implementations / practitioners). Therefore, we hope to cover such diversity across research communities such as artificial intelligence (particularly machine learning (evolutionary computation and zeroth-order optimization)), mathematical optimization/programming (particularly global optimization), operations research / management science, automatic control, open-source software, and perhaps others.
  • Respect for Originality

    • “It is both enjoyable and educational to hear the ideas directly from the creators”. (From Hennessy, J.L. and Patterson, D.A., 2019. Computer architecture: A quantitative approach (Sixth Edition). Elsevier.)

    • For each optimizer considered here, we expect to give its original/representative reference (including its good implementations/improvements). If you find some important reference missed here, please do NOT hesitate to contact us (we will be happy to add it if necessary).

  • Respect for Repeatability

    • For randomized search, properly controlling randomness is crucial to repeating numerical experiments. Here we follow the Random Sampling suggestions from NumPy. In other words, you must explicitly set the random seed for each optimizer.
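The seeding convention above can be sketched with plain NumPy (independent of pypop7's own option names, which are not shown here): an explicit seed makes a sample stream reproducible, and `SeedSequence.spawn` derives independent child seeds when running several optimizers in one experiment.

```python
import numpy as np

# Following NumPy's Random Sampling guidance: pass an explicit seed to
# numpy.random.default_rng so every run is repeatable.
rng_a = np.random.default_rng(2022)
rng_b = np.random.default_rng(2022)
# The same seed yields an identical sample stream.
assert np.array_equal(rng_a.standard_normal(5), rng_b.standard_normal(5))

# For several optimizers in one experiment, derive statistically
# independent child seeds from a single root seed.
root = np.random.SeedSequence(2022)
child_rngs = [np.random.default_rng(s) for s in root.spawn(3)]
samples = [r.standard_normal(4) for r in child_rngs]
```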


Research Support

This open-source Python library for black-box optimization is now supported by Shenzhen Fundamental Research Program under Grant No. JCYJ20200109141235597 (¥2,000,000), granted to Prof. Yuhui Shi (CSE, SUSTech @ Shenzhen, China), and actively developed (from 2021 to 2023) by his group members (e.g., Qiqi Duan, Chang Shao, Guochen Zhou, and Youkui Zhang).

We also acknowledge the initial discussions and Python code efforts (dating back to about 2017) with Hao Tong from the research group of Prof. Yao (CSE, SUSTech @ Shenzhen, China).


Download files

Download the file for your platform.

Source Distribution

pypop7-0.0.2.tar.gz (52.1 kB)

Uploaded Source

Built Distribution


pypop7-0.0.2-py3-none-any.whl (78.1 kB)

Uploaded Python 3

File details

Details for the file pypop7-0.0.2.tar.gz.

File metadata

  • Download URL: pypop7-0.0.2.tar.gz
  • Upload date:
  • Size: 52.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.0 CPython/3.8.8

File hashes

Hashes for pypop7-0.0.2.tar.gz
  • SHA256: f2da1f2b39d72a818fd717da87b3dfa131e90ad6d1c7d9b562a0d26f5e63ae6e
  • MD5: 8a57adc4bd5b211960c8756e22f89efb
  • BLAKE2b-256: 116e765b8c9e82968c10694856f1962d3ad29402dd27e9c528213c66e0ce6ba9


File details

Details for the file pypop7-0.0.2-py3-none-any.whl.

File metadata

  • Download URL: pypop7-0.0.2-py3-none-any.whl
  • Upload date:
  • Size: 78.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.0 CPython/3.8.8

File hashes

Hashes for pypop7-0.0.2-py3-none-any.whl
  • SHA256: 08d0f9fc0abad387871e5bb65ef9953762e9b14edaa8c62ed755d13b3c236913
  • MD5: ed58293f8032ab54c861c9f2295e58b1
  • BLAKE2b-256: 97bca96076cb0f8b1ce269678c60fedbab91c9f0321a330b353b67e945c4926c

