This framework provides tools for solving continuous optimisation problems (among others) using a hyper-heuristic approach for customising metaheuristics.
Project description
customhys
Detailed information about this framework can be found in [1, 2] (see the Seminal Papers below). In addition, the code for each module is well documented.
🛠 Requirements:
- Check the requirements.txt file.
- For Apple Silicon, one may need to install the TensorFlow dependencies via conda, for example:

```
conda install -c apple tensorflow-deps
```

Further information can be found in Install TensorFlow on Mac M1/M2 with GPU support by D. Ganzaroli.
🧰 Modules
The modules that comprise this framework depend on some basic Python packages, and they also interact with one another. Their relationships are summarised in the module dependency diagram.
NOTE: Each module is briefly described below. If you require further information, please check the corresponding source code.
🤯 Problems (benchmark functions)
This module includes several benchmark functions, implemented as classes, to be solved with optimisation techniques. The class structure is based on Keita Tomochika's repository optimization-evaluation.
Source: benchmark_func.py
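For illustration, a benchmark problem might be instantiated and evaluated as in the minimal sketch below; the class name Sphere, its constructor argument (the number of dimensions), and the evaluation method name are assumptions that should be checked against benchmark_func.py.

```python
# A minimal sketch, not a verified recipe: the class name, constructor argument,
# and evaluation method below are assumptions; see benchmark_func.py for the real API.
from customhys import benchmark_func as bf

problem = bf.Sphere(2)                             # assumed: 2-dimensional Sphere function
fitness = problem.get_function_value([0.0, 0.0])   # assumed evaluation method
print(fitness)                                     # the Sphere optimum at the origin is 0.0
```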
👯♂️ Population
This module contains the Population class. A Population object corresponds to a set of agents or individuals within a problem domain. These agents do not explore the function landscape by themselves, but they know when to update their positions according to a selection procedure.
Source: population.py
🦾 Search Operators (low-level heuristics)
This module has a collection of search operators (simple heuristics) extracted from several well-known metaheuristics in the literature. Such operators act on a population, i.e., they modify the individuals' positions.
Source: operators.py
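Each search operator is described by its name, a dictionary of control parameters, and a selector, matching the search_operator_structure documented in the Data Structure section below. The operator name and parameter values in the following snippet are illustrative assumptions.

```python
# Illustrative only: the operator name 'random_search' and its control parameters
# are assumptions; check operators.py for the available operators, their parameter
# names, and the supported selectors.
search_operator = (
    'random_search',     # operator_name = {str}
    {'scale': 0.01},     # control_parameters = {dict: P}
    'greedy',            # selector = {str}
)
```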
🤖 Metaheuristic (mid-level heuristic)
This module contains the Metaheuristic class. A metaheuristic object implements a set of search operators to guide a population in a search procedure within an optimisation problem.
Source: metaheuristic.py
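The sketch below outlines how a metaheuristic could be assembled from a problem and a set of search operators; the constructor arguments and the method names (get_formatted_problem, run, get_solution) are assumptions to be verified against metaheuristic.py.

```python
# A minimal sketch under assumed names; consult metaheuristic.py for the real API.
from customhys import benchmark_func as bf
from customhys import metaheuristic as mh

problem = bf.Sphere(2)                                        # assumed benchmark class
heuristics = [('random_search', {'scale': 0.01}, 'greedy')]   # assumed operator specification

met = mh.Metaheuristic(problem.get_formatted_problem(),       # assumed helper method
                       heuristics,
                       num_agents=30,                         # assumed parameter names
                       num_iterations=100)
met.run()                                                     # assumed run method
print(met.get_solution())                                     # assumed accessor
```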
👽 Hyper-heuristic (high-level heuristic)
This module contains the Hyperheuristic class. It is similar to the Metaheuristic class, but in this case a collection of search operators is required. A hyper-heuristic object searches within the heuristic space to find the sequence that builds the best metaheuristic for a specific problem.
Source: hyperheuristic.py
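The intended workflow is sketched below; the constructor arguments (a heuristic space, a formatted problem, and a file label) and the run() method with its return values are assumptions that should be checked against hyperheuristic.py.

```python
# A minimal sketch under assumed names; see hyperheuristic.py for the real API.
from customhys import benchmark_func as bf
from customhys import hyperheuristic as hh

problem = bf.Sphere(2)                                            # assumed benchmark class
hyp = hh.Hyperheuristic(heuristic_space='default.txt',            # assumed: file listing search operators
                        problem=problem.get_formatted_problem(),  # assumed helper method
                        file_label='sphere-2D')                   # assumed label for output files
solution, performance, encoded_solution = hyp.run()               # assumed return values
```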
🏭 Experiment
This module contains the Experiment class. An experiment object can run several hyper-heuristic procedures for a list of optimisation problems.
Source: experiment.py
🗜️ Tools
This module contains several functions and methods utilised by many modules in this package.
Source: tools.py
🧠 Machine Learning
This module contains the implementation of machine learning models that can power a hyper-heuristic model from this framework. In particular, it implements a wrapper for a neural network model from TensorFlow. It also contains auxiliary data structures that process samples of sequences to generate training data for machine learning models.
Source: machine_learning.py
💾 Data Structure
The experiments are saved in JSON files. The data structure of a saved file follows a particular scheme described below.
data_frame = {dict: N}
|-- 'problem' = {list: N}
| |-- 0 = {str}
: :
|-- 'dimensions' = {list: N}
| |-- 0 = {int}
: :
|-- 'results' = {list: N}
| |-- 0 = {dict: 6}
| | |-- 'iteration' = {list: M}
| | | |-- 0 = {int}
: : : :
| | |-- 'time' = {list: M}
| | | |-- 0 = {float}
: : : :
| | |-- 'performance' = {list: M}
| | | |-- 0 = {float}
: : : :
| | |-- 'encoded_solution' = {list: M}
| | | |-- 0 = {int}
: : : :
| | |-- 'solution' = {list: M}
| | | |-- 0 = {list: C}
| | | | |-- 0 = {list: 3}
| | | | | |-- search_operator_structure
: : : : : :
| | |-- 'details' = {list: M}
| | | |-- 0 = {dict: 4}
| | | | |-- 'fitness' = {list: R}
| | | | | |-- 0 = {float}
: : : : : :
| | | | |-- 'positions' = {list: R}
| | | | | |-- 0 = {list: D}
| | | | | | |-- 0 = {float}
: : : : : : :
| | | | |-- 'historical' = {list: R}
| | | | | |-- 0 = {dict: 5}
| | | | | | |-- 'fitness' = {list: I}
| | | | | | | |-- 0 = {float}
: : : : : : : :
| | | | | | |-- 'positions' = {list: I}
| | | | | | | |-- 0 = {list: D}
| | | | | | | | |-- 0 = {float}
: : : : : : : : :
| | | | | | |-- 'centroid' = {list: I}
| | | | | | | |-- 0 = {list: D}
| | | | | | | | |-- 0 = {float}
: : : : : : : : :
| | | | | | |-- 'radius' = {list: I}
| | | | | | | |-- 0 = {float}
: : : : : : : :
| | | | | | |-- 'stagnation' = {list: I}
| | | | | | | |-- 0 = {int}
: : : : : : : :
| | | | |-- 'statistics' = {dict: 10}
| | | | | |-- 'nob' = {int}
| | | | | |-- 'Min' = {float}
| | | | | |-- 'Max' = {float}
| | | | | |-- 'Avg' = {float}
| | | | | |-- 'Std' = {float}
| | | | | |-- 'Skw' = {float}
| | | | | |-- 'Kur' = {float}
| | | | | |-- 'IQR' = {float}
| | | | | |-- 'Med' = {float}
| | | | | |-- 'MAD' = {float}
: : : : : :
where:
- N is the number of files within the data_files folder
- M is the number of hyper-heuristic iterations (metaheuristic candidates)
- C is the number of search operators in the metaheuristic (cardinality)
- P is the number of control parameters for each search operator
- R is the number of repetitions performed for each metaheuristic candidate
- D is the dimensionality of the problem tackled by the metaheuristic candidate
- I is the number of iterations performed by the metaheuristic candidate
- search_operator_structure corresponds to [operator_name = {str}, control_parameters = {dict: P}, selector = {str}]
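Since each saved file is plain JSON, it can be inspected with the standard library, as in the snippet below; the file name data_files/experiment.json is a hypothetical example.

```python
# Read a saved experiment file and report the best performance per problem.
# Only the keys documented above are used; the file name is a hypothetical example,
# and "best" assumes that lower performance values are better (minimisation).
import json

with open('data_files/experiment.json', 'r') as file:
    data_frame = json.load(file)

for problem, dimensions, results in zip(data_frame['problem'],
                                        data_frame['dimensions'],
                                        data_frame['results']):
    best_performance = min(results['performance'])
    print(f"{problem} ({dimensions}D): best performance = {best_performance}")
```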
🏗️ Work-in-Progress
The following modules are available, but they may not work as expected. They are currently under development.
🌡️ Characterisation
This module intends to provide metrics for characterising the benchmark functions.
Source: characterisation.py
📊 Visualisation
This module intends to provide several tools for plotting results from the experiments.
Source: visualisation.py
References
Seminal Papers
The seminal papers that describe the framework's theoretical background and software implementation are:
- J. M. Cruz-Duarte, I. Amaya, J. C. Ortiz-Bayliss, H. Terashima-Marín, and Y. Shi, CUSTOMHyS: Customising Optimisation Metaheuristics via Hyper-heuristic Search, SoftwareX, vol. 12, p. 100628, 2020.
- J. M. Cruz-Duarte, J. C. Ortiz-Bayliss, I. Amaya, Y. Shi, H. Terashima-Marín, and N. Pillay, Towards a Generalised Metaheuristic Model for Continuous Optimisation Problems, Mathematics, vol. 8, no. 11, p. 2046, Nov. 2020.
- J. M. Cruz-Duarte, I. Amaya, J. C. Ortiz-Bayliss, S. E. Conant-Pablos, and H. Terashima-Marín, A Primary Study on Hyper-Heuristics to Customise Metaheuristics for Continuous Optimisation, 2020 IEEE Congress on Evolutionary Computation (CEC), 2020.
- J. M. Cruz-Duarte, I. Amaya, J. C. Ortiz-Bayliss, S. E. Conant-Pablos, H. Terashima-Marín, and Y. Shi, Hyper-Heuristics to Customise Metaheuristics for Continuous Optimisation, Swarm and Evolutionary Computation, 100935, 2021.
Published Journal Papers
These are the journal articles that have been published using this framework:
- J. M. Tapia-Avitia, J. M. Cruz‐Duarte, I. Amaya, J. C. Ortiz-Bayliss, H. Terashima-Marín, and N. Pillay, Analysing Hyper-Heuristics based on Neural Networks for the Automatic Design of Population-based Metaheuristics in Continuous Optimisation Problems, Swarm and Evolutionary Computation, 89, 101616, 2024.
- D. F. Zambrano-Gutierrez, G. H. Valencia-Rivera, J. G. Avina-Cervantes, I. Amaya, and J. M. Cruz-Duarte, Designing Heuristic-Based Tuners for Fractional-Order PID Controllers in Automatic Voltage Regulator Systems Using a Hyper-heuristic Approach, Fractal Fract, 2024.
- D. F. Zambrano-Gutierrez, J. M. Cruz-Duarte, J. G. Avina-Cervantes, J. C. Ortiz-Bayliss, J. J. Yanez-Borjas, and I. Amaya, Automatic Design of Metaheuristics for Practical Engineering Applications, IEEE Access, vol. 11, pp. 7262-7276, 2023.
- J. M. Cruz-Duarte, J. C. Ortiz-Bayliss, I. Amaya, and N. Pillay, Global Optimisation through Hyper-Heuristics: Unfolding Population-Based Metaheuristics, Appl. Sci., vol. 11, no. 12, p. 5620, 2021.
Presented Conference Papers
These are the conference articles that have been presented using this framework:
- D. F. Zambrano-Gutierrez, J. M. Cruz-Duarte, J. C. Ortiz-Bayliss, I. Amaya, and J. G. Avina-Cervantes, Beyond Traditional Tuning: Unveiling Metaheuristic Operator Trends in PID Control Tuning for Automatic Voltage Regulation, 2024 IEEE Congress on Evolutionary Computation (CEC), 2024.
- G. Pérez-Espinosa, J. M. Cruz-Duarte, I. Amaya, J. C. Ortiz-Bayliss, H. Terashima-Marín, and N. Pillay, Tailoring Metaheuristics for Designing Thermodynamic-Optimal Cooling Devices for Microelectronic Thermal Management Applications, 2024 IEEE Congress on Evolutionary Computation (CEC), 2024.
- D. Acosta-Ugalde, J. M. Cruz-Duarte, S. E. Conant-Pablos, and J. G. Falcón-Cardona, Beyond 'Novel' Metaphor-based Metaheuristics: An Interactive Algorithm Design Software, 2024 IEEE Congress on Evolutionary Computation (CEC), 2024.
- D. F. Zambrano-Gutierrez, A. C. Molina-Porras, J. G. Avina-Cervantes, R. Correa, and J. M. Cruz-Duarte, Designing Heuristic-Based Tuners for PID Controllers in Automatic Voltage Regulator Systems Using an Automated Hyper-Heuristic Approach, 2023 IEEE Symposium Series on Computational Intelligence (SSCI), Mexico City, Mexico, 2023, pp. 1263-1268.
- D. F. Zambrano-Gutierrez, A. C. Molina-Porras, E. Ovalle-Magallanes, I. Amaya, J. C. Ortiz-Bayliss, J. G. Avina-Cervantes, and J. M. Cruz-Duarte, SIGNRL: A Population-Based Reinforcement Learning Method for Continuous Control, 2023 IEEE Symposium Series on Computational Intelligence (SSCI), Mexico City, Mexico, 2023, pp. 1443-1448.
- D. F. Zambrano-Gutierrez, J. M. Cruz-Duarte, and H. Castañeda, Automatic Hyper-Heuristic to Generate Heuristic-based Adaptive Sliding Mode Controller Tuners for Buck-Boost Converters, in The Genetic and Evolutionary Computation Conference (GECCO), 2023, pp. 1-8. Nominated for the Best Paper Award.
- J. M. Tapia-Avitia, J. M. Cruz-Duarte, I. Amaya, J. C. Ortiz-Bayliss, H. Terashima-Marin, and N. Pillay. A Primary Study on Hyper-Heuristics Powered by Artificial Neural Networks for Customising Population-based Metaheuristics in Continuous Optimisation Problems, 2022 IEEE Congress on Evolutionary Computation (CEC), 2022.
- J. M. Cruz-Duarte, I. Amaya, J. C. Ortiz-Bayliss, N. Pillay. A Transfer Learning Hyper-heuristic Approach for Automatic Tailoring of Unfolded Population-based Metaheuristics, 2022 IEEE Congress on Evolutionary Computation (CEC), 2022.
- J. M. Cruz-Duarte, I. Amaya, J. C. Ortiz-Bayliss, N. Pillay. Automated Design of Unfolded Metaheuristics and the Effect of Population Size. 2021 IEEE Congress on Evolutionary Computation (CEC), 1155–1162, 2021.
Download files
Download the file for your platform.
- Source Distribution: customhys-1.1.8.tar.gz
- Built Distribution: customhys-1.1.8-py3-none-any.whl
File details
Details for the file customhys-1.1.8.tar.gz.
File metadata
- Download URL: customhys-1.1.8.tar.gz
- Upload date:
- Size: 228.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.14
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 32f8fdaa6a6034321f8a16efadff74c186dc84c5282d8a6fa7753d23d8e7aa48 |
| MD5 | fd179cd436f7e8646a4019fa3b2fcb55 |
| BLAKE2b-256 | 2e4a57e356829f09fbd22bb32264ccf34cf46b27c0bab5b87d3ad5ac97559b29 |
File details
Details for the file customhys-1.1.8-py3-none-any.whl.
File metadata
- Download URL: customhys-1.1.8-py3-none-any.whl
- Upload date:
- Size: 223.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.14
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 9bed155aded0fdc8bbcedc14337ef19e7f52abe80e3c471180538a8bc1081cd8 |
| MD5 | e715e40049923764f427a1a000abdb55 |
| BLAKE2b-256 | 18d9095c75a587a1a40eb0edf547bf0ab99edafa92e481679441418d605cbfe0 |