
customhys

Customising optimisation metaheuristics via hyper-heuristic search (CUSTOMHyS). This framework provides tools for solving continuous optimisation problems (among others) using a hyper-heuristic approach for customising metaheuristics. This approach is powered by a strategy based on Simulated Annealing. In addition, several search operators serve as building blocks for tailoring metaheuristics; they were extracted from ten well-known metaheuristics in the literature.

Detailed information about this framework can be found in [1, 2]. In addition, the code for each module is well documented.

🛠 Requirements:

Package         Version (>=)
Python          3.8
NumPy           1.22.0
SciPy           1.5.0
matplotlib      3.2.2
tqdm            4.47.0
pandas          1.5.3
scikit-learn    1.2.2
TensorFlow*     2.8.0

*For Mac M1/M2, TensorFlow may need to be installed via conda, for example:

conda install -c apple tensorflow-deps

Further information can be found at Install TensorFlow on Mac M1/M2 with GPU support by D. Ganzaroli.
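Once these requirements are satisfied, the framework itself can be installed from PyPI with pip:

pip install customhys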

🧰 Modules

The modules that comprise this framework depend on a few basic Python packages and also interact with one another. The module dependency diagram is shown below:

Module Dependency Diagram

NOTE: Each module is briefly described below. If you require further information, please check the corresponding source code.

🤯 Problems (benchmark functions)

This module includes several benchmark functions, implemented as classes, to be solved using optimisation techniques. The class structure is based on Keita Tomochika's repository optimization-evaluation.

Source: benchmark_func.py
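As a minimal sketch of how a benchmark function might be instantiated (the class name Sphere, its constructor argument, and the helper get_formatted_problem are assumptions; check benchmark_func.py for the exact names and signatures):

from customhys import benchmark_func as bf

# Assumed: a 2-dimensional Sphere problem and a helper that formats it as the
# problem dictionary expected by the Metaheuristic class (see further below).
fun = bf.Sphere(2)
problem = fun.get_formatted_problem()
print(problem)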

👯‍♂️ Population

This module contains the class Population. A Population object corresponds to a set of agents or individuals within a problem domain. These agents do not explore the function landscape by themselves, but they know when to update their positions according to a selection procedure.

Source: population.py

🦾 Search Operators (low-level heuristics)

This module has a collection of search operators (simple heuristics) extracted from several well-known metaheuristics in the literature. Such operators work over a population, i.e., they modify the individuals' positions.

Source: operators.py
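As described in the Data Structure section further below, each search operator is specified by its name, a dictionary of control parameters, and a selector. A hedged example of such a specification (the operator name 'random_search' and its parameter values are illustrative assumptions; see operators.py for the available operators and their parameters):

# Tuple format: (operator_name, control_parameters, selector)
heuristic = ('random_search', {'scale': 0.01, 'distribution': 'uniform'}, 'greedy')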

🤖 Metaheuristic (mid-level heuristic)

This module contains the Metaheuristic class. A metaheuristic object implements a set of search operators to guide a population in a search procedure within an optimisation problem.

Source: metaheuristic.py
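A minimal end-to-end sketch combining the previous pieces, assuming that the names shown here (Sphere, get_formatted_problem, Metaheuristic, run, get_solution) and their signatures match the documented API; verify against benchmark_func.py and metaheuristic.py before relying on it:

from customhys import benchmark_func as bf
from customhys import metaheuristic as mh

# Assumed benchmark class and problem-formatting helper
fun = bf.Sphere(2)
problem = fun.get_formatted_problem()

# One illustrative search operator (name and parameters are assumptions)
heuristics = [('random_search', {'scale': 0.01, 'distribution': 'uniform'}, 'greedy')]

# Assumed constructor and keyword arguments
met = mh.Metaheuristic(problem, heuristics, num_agents=30, num_iterations=100)
met.run()

# Assumed to return the best position found and its fitness
print(met.get_solution())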

👽 Hyper-heuristic (high-level heuristic)

This module contains the Hyperheuristic class. It is similar to the Metaheuristic class, but in this case a collection of search operators is required. A hyper-heuristic object searches within the heuristic space to find the sequence that builds the best metaheuristic for a specific problem.

Source: hyperheuristic.py

🏭 Experiment

This module contains the Experiment class. An experiment object can run several hyper-heuristic procedures for a list of optimisation problems.

Source: experiment.py

🗜️ Tools

This module contains several functions and methods utilised by many modules in this package.

Source: tools.py

🧠 Machine Learning

This module contains implementations of machine learning models that can power a hyper-heuristic model from this framework. In particular, it implements a wrapper for a neural network model from TensorFlow. It also contains auxiliary data structures that process samples of sequences to generate training data for machine learning models.

Source: machine_learning.py

💾 Data Structure

The experiments are saved in JSON files. The data structure of a saved file follows the scheme described below; a short sketch for reading such a file with Python's json module is given after the legend.

data_frame = {dict: N}
|-- 'problem' = {list: N}
|  |-- 0 = {str}
:  :
|-- 'dimensions' = {list: N}
|  |-- 0 = {int}
:  :
|-- 'results' = {list: N}
|  |-- 0 = {dict: 6}
|  |  |-- 'iteration' = {list: M}   
|  |  |  |-- 0 = {int}
:  :  :  :
|  |  |-- 'time' = {list: M}
|  |  |  |-- 0 = {float}
:  :  :  :
|  |  |-- 'performance' = {list: M}
|  |  |  |-- 0 = {float}
:  :  :  :
|  |  |-- 'encoded_solution' = {list: M}
|  |  |  |-- 0 = {int}
:  :  :  :
|  |  |-- 'solution' = {list: M}
|  |  |  |-- 0 = {list: C}
|  |  |  |  |-- 0 = {list: 3}
|  |  |  |  |  |-- search_operator_structure
:  :  :  :  :  :
|  |  |-- 'details' = {list: M}
|  |  |  |-- 0 = {dict: 4}
|  |  |  |  |-- 'fitness' = {list: R}
|  |  |  |  |  |-- 0 = {float}
:  :  :  :  :  :
|  |  |  |  |-- 'positions' = {list: R}
|  |  |  |  |  |-- 0 = {list: D}
|  |  |  |  |  |  |-- 0 = {float}
:  :  :  :  :  :  :
|  |  |  |  |-- 'historical' = {list: R}
|  |  |  |  |  |-- 0 = {dict: 5}
|  |  |  |  |  |  |-- 'fitness' = {list: I}
|  |  |  |  |  |  |  |-- 0 = {float}
:  :  :  :  :  :  :  :
|  |  |  |  |  |  |-- 'positions' = {list: I}
|  |  |  |  |  |  |  |-- 0 = {list: D}
|  |  |  |  |  |  |  |  |-- 0 = {float}
:  :  :  :  :  :  :  :  :
|  |  |  |  |  |  |-- 'centroid' = {list: I}
|  |  |  |  |  |  |  |-- 0 = {list: D}
|  |  |  |  |  |  |  |  |-- 0 = {float}
:  :  :  :  :  :  :  :  :
|  |  |  |  |  |  |-- 'radius' = {list: I}
|  |  |  |  |  |  |  |-- 0 = {float}
:  :  :  :  :  :  :  :
|  |  |  |  |  |  |-- 'stagnation' = {list: I}
|  |  |  |  |  |  |  |-- 0 = {int}
:  :  :  :  :  :  :  :
|  |  |  |  |-- 'statistics' = {dict: 10}
|  |  |  |  |  |-- 'nob' = {int}
|  |  |  |  |  |-- 'Min' = {float}
|  |  |  |  |  |-- 'Max' = {float}
|  |  |  |  |  |-- 'Avg' = {float}
|  |  |  |  |  |-- 'Std' = {float}
|  |  |  |  |  |-- 'Skw' = {float}
|  |  |  |  |  |-- 'Kur' = {float}
|  |  |  |  |  |-- 'IQR' = {float}
|  |  |  |  |  |-- 'Med' = {float}
|  |  |  |  |  |-- 'MAD' = {float}
:  :  :  :  :  :

where:

  • N is the number of files within the data_files folder
  • M is the number of hyper-heuristic iterations (metaheuristic candidates)
  • C is the number of search operators in the metaheuristic (cardinality)
  • P is the number of control parameters for each search operator
  • R is the number of repetitions performed for each metaheuristic candidate
  • D is the dimensionality of the problem tackled by the metaheuristic candidate
  • I is the number of iterations performed by the metaheuristic candidate
  • search_operator_structure corresponds to [operator_name = {str}, control_parameters = {dict: P}, selector = {str}]
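For instance, once such a file has been produced, its fields can be navigated with the standard json module (the file path below is hypothetical; the keys are those listed above):

import json

# Hypothetical file inside the data_files folder
with open('data_files/experiment.json', 'r') as file:
    data_frame = json.load(file)

first_problem = data_frame['problem'][0]        # str, problem name
first_dimensions = data_frame['dimensions'][0]  # int, problem dimensionality
first_results = data_frame['results'][0]        # dict with 'iteration', 'time', 'performance',
                                                #   'encoded_solution', 'solution', and 'details'

# Number of metaheuristic candidates evaluated for this problem
print(first_problem, first_dimensions, len(first_results['performance']))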

🏗️ Work-in-Progress

The following modules are available, but they may not work as expected. They are currently under development.

🌡️ Characterisation

This module intends to provide metrics for characterising the benchmark functions.

Source: characterisation.py

📊 Visualisation

This module intends to provide several tools for plotting results from the experiments.

Source: visualisation.py


References

  1. J. M. Cruz-Duarte, I. Amaya, J. C. Ortiz-Bayliss, H. Terashima-Marín, and Y. Shi, CUSTOMHyS: Customising Optimisation Metaheuristics via Hyper-heuristic Search, SoftwareX, vol. 12, p. 100628, 2020.
  2. J. M. Cruz-Duarte, I. Amaya, J. C. Ortiz-Bayliss, S. E. Conant-Pablos, H. Terashima-Marín, and Y. Shi, Hyper-Heuristics to Customise Metaheuristics for Continuous Optimisation, Swarm and Evolutionary Computation, p. 100935, 2021.
  3. J. M. Cruz-Duarte, I. Amaya, J. C. Ortiz-Bayliss, S. E. Conant-Pablos, and H. Terashima-Marín, A Primary Study on Hyper-Heuristics to Customise Metaheuristics for Continuous Optimisation, 2020 IEEE Congress on Evolutionary Computation (CEC), 2020.
  4. J. M. Cruz-Duarte, J. C. Ortiz-Bayliss, I. Amaya, Y. Shi, H. Terashima-Marín, and N. Pillay, Towards a Generalised Metaheuristic Model for Continuous Optimisation Problems, Mathematics, vol. 8, no. 11, p. 2046, Nov. 2020.
  5. J. M. Cruz-Duarte, J. C. Ortiz-Bayliss, I. Amaya, and N. Pillay, Global Optimisation through Hyper-Heuristics: Unfolding Population-Based Metaheuristics, Appl. Sci., vol. 11, no. 12, p. 5620, 2021.
  6. J. M. Cruz-Duarte, I. Amaya, J. C. Ortiz-Bayliss, N. Pillay. Automated Design of Unfolded Metaheuristics and the Effect of Population Size. 2021 IEEE Congress on Evolutionary Computation (CEC), 1155–1162, 2021.
  7. J. M. Tapia-Avitia, J. M. Cruz-Duarte, I. Amaya, J. C. Ortiz-Bayliss, H. Terashima-Marin, and N. Pillay. A Primary Study on Hyper-Heuristics Powered by Artificial Neural Networks for Customising Population-based Metaheuristics in Continuous Optimisation Problems, 2022 IEEE Congress on Evolutionary Computation (CEC), 2022.
  8. J. M. Cruz-Duarte, I. Amaya, J. C. Ortiz-Bayliss, N. Pillay. A Transfer Learning Hyper-heuristic Approach for Automatic Tailoring of Unfolded Population-based Metaheuristics, 2022 IEEE Congress on Evolutionary Computation (CEC), 2022.

