
Reason this release was yanked:

Configuration files were not included

Project description


mloptimizer is a Python library for optimizing hyperparameters of machine learning algorithms using genetic algorithms. With mloptimizer, you can find the optimal set of hyperparameters for a given machine learning model and dataset, which can significantly improve the performance of the model. The library supports several popular machine learning algorithms, including decision trees, random forests, and gradient boosting classifiers. The genetic algorithm used in mloptimizer provides an efficient and flexible approach to search for the optimal hyperparameters in a large search space.
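To make the approach concrete, here is a toy sketch of how a genetic algorithm can search a hyperparameter space: keep the fittest individuals, recombine them, and mutate the offspring. This is an illustration only, not mloptimizer's implementation; the space, the stand-in score function, and every name below are invented for the example (in practice the score would be cross-validated model performance).

```python
import random

random.seed(42)  # reproducible run

# Invented search space: hyperparameter name -> (low, high) integer range
SPACE = {"max_depth": (1, 20), "min_samples_split": (2, 40)}

def score(ind):
    # Stand-in objective; real use would train and cross-validate a model.
    return -abs(ind["max_depth"] - 7) - abs(ind["min_samples_split"] - 10)

def random_individual():
    return {k: random.randint(lo, hi) for k, (lo, hi) in SPACE.items()}

def mutate(ind):
    # Nudge one hyperparameter, clamped to its allowed range
    child = dict(ind)
    k = random.choice(list(SPACE))
    lo, hi = SPACE[k]
    child[k] = min(hi, max(lo, child[k] + random.randint(-2, 2)))
    return child

def crossover(a, b):
    # Uniform crossover: each gene comes from one parent at random
    return {k: random.choice((a[k], b[k])) for k in SPACE}

population = [random_individual() for _ in range(10)]
for _ in range(10):  # generations
    population.sort(key=score, reverse=True)
    parents = population[:5]  # selection: keep the best half
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(5)]
    population = parents + children

best = max(population, key=score)
print(best)
```

Selection pressure steadily concentrates the population around high-scoring regions of the space, which is why this family of methods copes well with large, mixed search spaces.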

Features

  • Easy to use
  • DEAP-based genetic algorithm ready to use with several machine learning algorithms
  • Adaptable to use with any machine learning algorithm that complies with the Scikit-Learn API
  • Default hyperparameter ranges
  • Default score functions for evaluating the performance of the model
  • Reproducibility of results
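On the last point: for a stochastic search, reproducibility essentially means seeding the random number generator, so that the same seed produces the same sequence of candidate individuals. The sketch below shows the general idea with a private RNG; it is not mloptimizer's API (check the documentation for how the library itself accepts a seed).

```python
import random

def sample_candidates(seed, n=5):
    # A private RNG avoids mutating global random state
    rng = random.Random(seed)
    return [rng.randint(1, 20) for _ in range(n)]

run_a = sample_candidates(seed=123)
run_b = sample_candidates(seed=123)
assert run_a == run_b  # identical seeds -> identical runs
print(run_a)
```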

Advanced Features

  • Extensible with more machine learning algorithms that comply with the Scikit-Learn API
  • Customizable hyperparameter ranges
  • Customizable score functions
  • Optional mlflow compatibility for tracking the optimization process
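Conceptually, a custom hyperparameter range is a mapping from parameter names to ranges or choices, and a custom score function is any callable that compares true and predicted labels. The dict layout below is purely illustrative (mloptimizer defines spaces through its `HyperparameterSpaceBuilder`; see the Quickstart and documentation for the actual API), and the score function is an invented example.

```python
# Illustrative only: the information a custom hyperparameter space carries,
# NOT mloptimizer's internal format.
custom_space = {
    "max_depth":        {"type": "int",   "range": (2, 20)},
    "min_samples_leaf": {"type": "int",   "range": (1, 10)},
    "criterion":        {"type": "cat",   "choices": ["gini", "entropy"]},
    "ccp_alpha":        {"type": "float", "range": (0.0, 0.1)},
}

def balanced_accuracy(y_true, y_pred):
    """Example custom score: mean of per-class recalls."""
    classes = set(y_true)
    recalls = []
    for c in classes:
        idx = [i for i, t in enumerate(y_true) if t == c]
        hits = sum(1 for i in idx if y_pred[i] == c)
        recalls.append(hits / len(idx))
    return sum(recalls) / len(recalls)

print(balanced_accuracy([0, 0, 1, 1], [0, 1, 1, 1]))  # 0.75
```

A score of this shape is useful when classes are imbalanced, since plain accuracy can be dominated by the majority class.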

Installation

It is recommended to create a virtual environment using the venv package. To learn more about how to use venv, check out the official Python documentation at https://docs.python.org/3/library/venv.html.

# Create the virtual environment
python -m venv myenv
# Activate the virtual environment (on Windows: myenv\Scripts\activate)
source myenv/bin/activate

To install mloptimizer, run:

pip install mloptimizer

You can get more information about the package installation at https://pypi.org/project/mloptimizer/.

Quickstart

Here's a simple example of how to optimize hyperparameters in a decision tree classifier using the iris dataset:

from mloptimizer.interfaces import GeneticSearch, HyperparameterSpaceBuilder
from sklearn.tree import DecisionTreeClassifier
from sklearn.datasets import load_iris

# 1) Load the dataset and get the features and target
X, y = load_iris(return_X_y=True)

# 2) Define the hyperparameter space (a default space is provided for some algorithms)
hyperparameter_space = HyperparameterSpaceBuilder.get_default_space(DecisionTreeClassifier)

# 3) Create the optimizer and optimize the classifier
opt = GeneticSearch(estimator_class=DecisionTreeClassifier,
                    hyperparam_space=hyperparameter_space)

# 4) Optimize the classifier; the optimization returns the best estimator found
# - 10 generations starting with a population of 10 individuals; other parameters keep their defaults
opt.fit(X, y, population_size=10, generations=10)

print(opt.best_estimator_)

Other algorithms, such as RandomForestClassifier or XGBClassifier, can be used as well; both have a default hyperparameter space defined in the library. Even if an algorithm has no default hyperparameter space, you can define your own by following the documentation.
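Complying with the Scikit-Learn API essentially means exposing `fit`/`predict` plus `get_params`/`set_params`, with hyperparameters stored by `__init__`. The toy classifier below is a hypothetical minimal example of that contract (it is not from mloptimizer, and its `smoothing` parameter is an invented placeholder); an estimator shaped like this is the kind of object a Scikit-Learn-style search can tune.

```python
from collections import Counter

class MajorityClassifier:
    """Hypothetical minimal estimator following Scikit-Learn conventions."""

    def __init__(self, smoothing=0):
        self.smoothing = smoothing  # a toy tunable hyperparameter (unused in fit)

    def fit(self, X, y):
        # Learn the most frequent label; fitted attributes get a trailing "_"
        self.majority_ = Counter(y).most_common(1)[0][0]
        return self

    def predict(self, X):
        return [self.majority_ for _ in X]

    def get_params(self, deep=True):
        # Expose hyperparameters so a search can read them
        return {"smoothing": self.smoothing}

    def set_params(self, **params):
        # ...and write candidate values back before each fit
        for name, value in params.items():
            setattr(self, name, value)
        return self

clf = MajorityClassifier().fit([[0], [1], [2]], ["a", "a", "b"])
print(clf.predict([[9], [9]]))  # ['a', 'a']
```

In practice you would inherit from `sklearn.base.BaseEstimator`, which provides `get_params`/`set_params` for free; they are written out here only to show the full contract.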

More details in the documentation.

Examples

Examples can be found in the examples section of the documentation on readthedocs.io.

Dependencies

The following dependencies are used in mloptimizer:

  • DEAP - Genetic algorithms
  • XGBoost - Gradient boosting classifier
  • Scikit-Learn - Machine learning algorithms and utilities

Optional:

  • Keras - Deep learning library
  • mlflow - Tracking the optimization process

Documentation

The documentation for mloptimizer, including examples and a reference for its classes and methods, can be found in the project's wiki.

Authors

  • Antonio Caparrini - Author - caparrini
  • Javier Arroyo Gallardo - Author - javiag

License

This project is licensed under the MIT License.

Download files

Download the file for your platform.

Source Distribution

mloptimizer-0.9.0.tar.gz (45.1 kB)

Built Distribution

mloptimizer-0.9.0-py3-none-any.whl (56.5 kB)

File details

Details for the file mloptimizer-0.9.0.tar.gz.

File metadata

  • Download URL: mloptimizer-0.9.0.tar.gz
  • Size: 45.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for mloptimizer-0.9.0.tar.gz:

  • SHA256: b4d5767a9d2d9b74fcf88b34dffff3ed980f6cc5003ac9919743178231e9c611
  • MD5: 14b3f3a034e43b162d1d847e841a339e
  • BLAKE2b-256: 42543dc026d5e4ffb1aa93c84f2506c69ee783840747072c1c0aae464286c0af


File details

Details for the file mloptimizer-0.9.0-py3-none-any.whl.

File metadata

  • Download URL: mloptimizer-0.9.0-py3-none-any.whl
  • Size: 56.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for mloptimizer-0.9.0-py3-none-any.whl:

  • SHA256: 9c0291077a888e6b7dc7182b1426ce58f68bd6fc422e38d8a69872c5c9c94c6e
  • MD5: df8d23bc1fba9afdb85cdd84063209ef
  • BLAKE2b-256: 9c08df3cf04c980c1463d9c60afc3f877fb20ac52b97c2e97da32e0dc0bca460

