
MetaPerceptron: A Standardized Framework for Metaheuristic-Trained Multi-Layer Perceptron

Project description


MetaPerceptron (Metaheuristic-optimized Multi-Layer Perceptron) is a Python library that implements the traditional Multi-Layer Perceptron (MLP) and its variants. These include metaheuristic-optimized MLP models (GA, PSO, WOA, TLO, DE, ...) and gradient descent-optimized MLP models (SGD, Adam, Adadelta, Adagrad, ...). It provides a comprehensive set of optimizers for training MLP models and is compatible with the Scikit-Learn API, so you can perform model searches and hyperparameter tuning with Scikit-Learn utilities (see the sketch after the feature list below).

  • Free software: GNU General Public License (GPL) v3
  • Provided estimators: MlpRegressor, MlpClassifier, MhaMlpRegressor, MhaMlpClassifier
  • Provided utilities: MhaMlpTuner and MhaMlpComparator
  • Metaheuristic-trained MLP regressors: > 200 models
  • Metaheuristic-trained MLP classifiers: > 200 models
  • Gradient descent-trained MLP regressors: 12 models
  • Gradient descent-trained MLP classifiers: 12 models
  • Supported performance metrics: >= 67 (47 regression and 20 classification metrics)
  • Documentation: https://metaperceptron.readthedocs.io
  • Python versions: >= 3.8.x
  • Dependencies: numpy, scipy, scikit-learn, torch, mealpy, pandas, permetrics
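
Because the provided estimators follow the Scikit-Learn API, they can be dropped into standard utilities such as GridSearchCV. Below is a minimal sketch using the constructor arguments shown in the tutorial further down; the grid values (and the "ReLU" activation name) are illustrative assumptions, not recommended settings.

from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from metaperceptron import MhaMlpClassifier

X, y = load_iris(return_X_y=True)

## Base estimator; constructor arguments follow the tutorial below
model = MhaMlpClassifier(hidden_layers=(50,), act_names="Tanh", dropout_rates=None, act_output=None,
                         optim="BaseGA", optim_paras={"epoch": 50, "pop_size": 20},
                         obj_name="F1S", seed=42, verbose=False)

## Any constructor argument can appear in the grid; these candidate
## values are illustrative assumptions
param_grid = {
    "hidden_layers": [(30,), (50,)],
    "act_names": ["Tanh", "ReLU"],
}
searcher = GridSearchCV(model, param_grid, cv=3)
searcher.fit(X, y)
print(searcher.best_params_)
print(searcher.best_score_)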

Citation Request

If you want to understand how metaheuristic algorithms are applied to the Multi-Layer Perceptron, read the paper titled "Let a biogeography-based optimizer train your Multi-Layer Perceptron". The paper can be accessed at the following link.

Please include these citations if you plan to use this library:

@software{nguyen_van_thieu_2023_10251022,
  author       = {Nguyen Van Thieu},
  title        = {MetaPerceptron: A Standardized Framework for Metaheuristic-Trained Multi-Layer Perceptron},
  month        = dec,
  year         = 2023,
  publisher    = {Zenodo},
  doi          = {10.5281/zenodo.10251021},
  url          = {https://github.com/thieu1995/MetaPerceptron}
}

@article{van2023mealpy,
  title={MEALPY: An open-source library for latest meta-heuristic algorithms in Python},
  author={Van Thieu, Nguyen and Mirjalili, Seyedali},
  journal={Journal of Systems Architecture},
  year={2023},
  publisher={Elsevier},
  doi={10.1016/j.sysarc.2023.102871}
}

@article{van2023groundwater,
  title={Groundwater level modeling using Augmented Artificial Ecosystem Optimization},
  author={Van Thieu, Nguyen and Barma, Surajit Deb and Van Lam, To and Kisi, Ozgur and Mahesha, Amai},
  journal={Journal of Hydrology},
  volume={617},
  pages={129034},
  year={2023},
  publisher={Elsevier}
}

@article{thieu2019efficient,
  title={Efficient time-series forecasting using neural network and opposition-based coral reefs optimization},
  author={Nguyen, Thieu and Nguyen, Tu and Nguyen, Binh Minh and Nguyen, Giang},
  journal={International Journal of Computational Intelligence Systems},
  volume={12},
  number={2},
  pages={1144--1161},
  year={2019}
}

Simple Tutorial

  • Install the library:
$ pip install metaperceptron==2.0.0
  • Check the installed version:
$ python
>>> import metaperceptron
>>> metaperceptron.__version__
  • Import all provided classes from MetaPerceptron
from metaperceptron import DataTransformer, Data
from metaperceptron import MhaMlpRegressor, MhaMlpClassifier, MlpRegressor, MlpClassifier
from metaperceptron import MhaMlpTuner, MhaMlpComparator
  • In this tutorial, we use the Genetic Algorithm to train a Multi-Layer Perceptron network on a classification task. For more complex examples and use cases, please check the examples folder.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from metaperceptron import DataTransformer, MhaMlpClassifier

## Load the dataset
X, y = load_iris(return_X_y=True)

## Split train and test
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

## Scale dataset with two methods: standard and minmax
dt = DataTransformer(scaling_methods=("standard", "minmax"))
X_train_scaled = dt.fit_transform(X_train)
X_test_scaled = dt.transform(X_test)

## Define Genetic Algorithm-trained Multi-Layer Perceptron
opt_paras = {"epoch": 100, "pop_size": 20}
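# hidden_layers=(50,): a single hidden layer of 50 neurons
# act_names="Tanh": hidden-layer activation function
# optim="BaseGA": MEALPY optimizer name (here, the Genetic Algorithm)
# obj_name="F1S": objective metric to optimize (F1 score)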
model = MhaMlpClassifier(hidden_layers=(50,), act_names="Tanh", dropout_rates=None, act_output=None,
                         optim="BaseGA", optim_paras=opt_paras, obj_name="F1S", seed=42, verbose=True)
## Train the model
model.fit(X=X_train_scaled, y=y_train)

## Test the model on the scaled test set (it was trained on scaled inputs)
y_pred = model.predict(X_test_scaled)
print(y_pred)

## Print the score
print(model.score(X_test_scaled, y_test))

## Calculate some metrics
print(model.evaluate(y_true=y_test, y_pred=y_pred, list_metrics=["AS", "PS", "RS", "F2S", "CKS", "FBS"]))
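
The same workflow applies to regression. The sketch below assumes MhaMlpRegressor mirrors the classifier's constructor and that "MSE" is a valid objective name; check the documentation and the examples folder to confirm.

from sklearn.datasets import load_diabetes
from sklearn.model_selection import train_test_split
from metaperceptron import DataTransformer, MhaMlpRegressor

## Load a regression dataset
X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

## Scale features, as in the classification tutorial
dt = DataTransformer(scaling_methods=("standard",))
X_train_scaled = dt.fit_transform(X_train)
X_test_scaled = dt.transform(X_test)

## Assumed: the regressor mirrors MhaMlpClassifier's constructor,
## with an objective metric suited to regression ("MSE" is an assumption)
model = MhaMlpRegressor(hidden_layers=(30,), act_names="Tanh",
                        optim="BaseGA", optim_paras={"epoch": 100, "pop_size": 20},
                        obj_name="MSE", seed=42, verbose=False)
model.fit(X=X_train_scaled, y=y_train)
print(model.score(X_test_scaled, y_test))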



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

metaperceptron-2.0.0.tar.gz (48.8 kB, Source)

Built Distribution

metaperceptron-2.0.0-py3-none-any.whl (45.4 kB, Python 3)

File details

Details for the file metaperceptron-2.0.0.tar.gz.

File metadata

  • Download URL: metaperceptron-2.0.0.tar.gz
  • Upload date:
  • Size: 48.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for metaperceptron-2.0.0.tar.gz

  • SHA256: 3a1d3bf5ef863bea00ac3475064b940f0bca80d813c3baefc3c3a47d548be6c4
  • MD5: 03b58b65f69e14154fa64ae10f2e8dea
  • BLAKE2b-256: d28a386f0e85a1825ef8766de44a85b282dbf80e9c8b9461cbc10d1f94270c9c
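
To check a downloaded archive against the digests above, you can hash it locally with Python's standard library; a minimal sketch (the path assumes the archive was saved under its original name in the current directory):

import hashlib

## Path to the locally downloaded archive
path = "metaperceptron-2.0.0.tar.gz"

with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

## Compare against the published SHA256 digest above
print(digest == "3a1d3bf5ef863bea00ac3475064b940f0bca80d813c3baefc3c3a47d548be6c4")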


File details

Details for the file metaperceptron-2.0.0-py3-none-any.whl.

File metadata

File hashes

Hashes for metaperceptron-2.0.0-py3-none-any.whl

  • SHA256: 83b4edd3ae3bffb145f1d37ef7416c5011ae74df57c5b66f17b60e0706defed3
  • MD5: 24cf87ac2a03bfd8c1737040da335ef8
  • BLAKE2b-256: f7164c588d0ab9416a623e0668a1b8272ae05bcd59e55f00f149326ffffceffa

