
Meta-heuristic optimization techniques for scikit-learn, XGBoost and Keras models

Project description


Hyperactive

A Python package for meta-heuristic hyperparameter optimization of scikit-learn models (with experimental Keras support) for supervised learning. Hyperactive automates the search for hyperparameters by using meta-heuristics to explore the search space efficiently and return a sufficiently good solution. Its API is similar to scikit-learn's and supports parallel computation. Hyperactive currently offers the following meta-heuristic optimization techniques:

  • Random search
  • Simulated annealing
  • Particle swarm optimization

The multiprocessing starts n_jobs separate searches. These run independently of one another, which makes the workload perfectly parallel.
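Conceptually, this is the same pattern as launching several independent searches with Python's own multiprocessing and keeping the best result. A rough sketch of that pattern (for illustration only, not Hyperactive's internals):

from multiprocessing import Pool
import random

def run_one_search(seed):
    # each worker performs its own independent (here, toy) random search
    rng = random.Random(seed)
    candidates = [rng.choice(range(10, 100, 10)) for _ in range(10)]
    # in a real search each candidate would be cross-validated; here the
    # "score" is just the candidate value, purely for illustration
    return max(candidates)

if __name__ == "__main__":
    with Pool(processes=4) as pool:           # n_jobs=4 -> four separate searches
        best_per_search = pool.map(run_one_search, range(4))
    print(max(best_per_search))               # best result across the independent runs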

Installation

pip install hyperactive

Examples

A very basic example:

from sklearn.datasets import load_iris

from hyperactive import RandomSearch_Optimizer

iris_data = load_iris()
X = iris_data.data
y = iris_data.target

# define the model and the hyperparameter search space
search_config = {
    "sklearn.ensemble.RandomForestClassifier": {"n_estimators": range(10, 100, 10)}
}

# random search with 10 iterations
Optimizer = RandomSearch_Optimizer(search_config, 10)
Optimizer.fit(X, y)
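Once fitted, the optimizer can be used like a scikit-learn estimator. The calls below only use the methods documented in the API section further down; the export filename is just an example:

prediction = Optimizer.predict(X)
score = Optimizer.score(X, y)

# export the fitted model to a file (filename is illustrative)
Optimizer.export("random_forest_model")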

Example with a larger search space and evaluation on a held-out test set:

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

from hyperactive import SimulatedAnnealing_Optimizer

iris_data = load_iris()
X = iris_data.data
y = iris_data.target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33)

# this defines the model and hyperparameter search space
search_config = {
    "sklearn.ensemble.RandomForestClassifier": {
        "n_estimators": range(10, 100, 10),
        "max_depth": [3, 4, 5, 6],
        "criterion": ["gini", "entropy"],
        "min_samples_split": range(2, 21),
        "min_samples_leaf": range(2, 21),
    }
}

Optimizer = SimulatedAnnealing_Optimizer(search_config, 100, n_jobs=4)

# search for the best hyperparameters on the training data
Optimizer.fit(X_train, y_train)

# predict from test data
prediction = Optimizer.predict(X_test)

# calculate accuracy score
score = Optimizer.score(X_test, y_test)
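For perspective, the grid above already spans far more configurations than the 100 iterations given to the optimizer, which is exactly the situation a meta-heuristic search is meant for. A quick, purely illustrative count of the candidate configurations:

# number of candidate configurations in the search space above
model_params = search_config["sklearn.ensemble.RandomForestClassifier"]
n_candidates = 1
for values in model_params.values():
    n_candidates *= len(list(values))
print(n_candidates)  # 9 * 4 * 2 * 19 * 19 = 25992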

Example with a feedforward neural network in Keras (experimental):

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

from hyperactive import ParticleSwarm_Optimizer

breast_cancer_data = load_breast_cancer()
X = breast_cancer_data.data
y = breast_cancer_data.target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33)

# this defines the structure of the model and the search space in each layer
search_config = {
    "keras.compile.0": {"loss": ["binary_crossentropy"], "optimizer": ["adam"]},
    "keras.fit.0": {"epochs": [5], "batch_size": [100]},
    "keras.layers.Dense.1": {
        "units": range(5, 15),
        "activation": ["relu"],
        "kernel_initializer": ["uniform"],
    },
    "keras.layers.Dense.2": {
        "units": range(5, 15),
        "activation": ["relu"],
        "kernel_initializer": ["uniform"],
    },
    "keras.layers.Dense.3": {"units": [1], "activation": ["sigmoid"]},
}
Optimizer = ParticleSwarm_Optimizer(search_config, 3, cv=1)

# search for the best hyperparameters on the training data
Optimizer.fit(X_train, y_train)

# predict from test data
prediction = Optimizer.predict(X_test)

# calculate accuracy score
score = Optimizer.score(X_test, y_test)
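For one candidate drawn from this search space (e.g. units=10 in both hidden layers), the model being evaluated corresponds roughly to the plain Keras model below. This is a hand-written sketch for orientation only, not code generated by Hyperactive, and the explicit input_dim is added here just so the snippet stands on its own:

from keras.models import Sequential
from keras.layers import Dense

# one possible candidate: units=10 in both hidden layers
model = Sequential()
model.add(Dense(units=10, activation="relu", kernel_initializer="uniform",
                input_dim=X_train.shape[1]))
model.add(Dense(units=10, activation="relu", kernel_initializer="uniform"))
model.add(Dense(units=1, activation="sigmoid"))

model.compile(loss="binary_crossentropy", optimizer="adam")
model.fit(X_train, y_train, epochs=5, batch_size=100)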

Hyperactive API

Classes:

RandomSearch_Optimizer(search_config, n_iter, scoring="accuracy", n_jobs=1, cv=5, verbosity=1, random_state=None, start_points=None)
SimulatedAnnealing_Optimizer(search_config, n_iter, scoring="accuracy", n_jobs=1, cv=5, verbosity=1, random_state=None, start_points=None, eps=1, t_rate=0.99)
ParticleSwarm_Optimizer(search_config, n_iter, scoring="accuracy", n_jobs=1, cv=5, verbosity=1, random_state=None, start_points=None, n_part=1, w=0.5, c_k=0.5, c_s=0.9)
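All three classes are imported from the top-level package:

from hyperactive import (
    RandomSearch_Optimizer,
    SimulatedAnnealing_Optimizer,
    ParticleSwarm_Optimizer,
)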

General positional arguments:

Argument Type Description
search_config dict hyperparameter search space to explore by the optimizer
n_iter int number of iterations to perform

General keyword arguments:

Argument Type Default Description
scoring str "accuracy" scoring metric used for model evaluation
n_jobs int 1 number of jobs to run in parallel (-1 for maximum)
cv int 5 number of cross-validation folds
verbosity int 1 controls how much model and scoring information is printed
random_state int None seed for the random number generator
start_points dict None hyperparameter configuration to start the search from

Specific keyword arguments (simulated annealing):

Argument Type Default Description
eps int 1 epsilon
t_rate float 0.99 cooling rate

Specific keyword arguments (particle swarm optimization):

Argument Type Default Description
n_part int 1 number of particles
w float 0.5 inertia factor
c_k float 0.8 cognitive factor
c_s float 0.9 social factor
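Putting the tables above together, a particle swarm optimizer with its keyword arguments spelled out explicitly looks like this (the concrete values are only illustrative):

Optimizer = ParticleSwarm_Optimizer(
    search_config,
    100,                  # n_iter
    scoring="accuracy",
    n_jobs=2,
    cv=5,
    n_part=10,
    w=0.5,
    c_k=0.5,
    c_s=0.9,
)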

General methods:

fit(self, X_train, y_train)
Argument Type Description
X_train array-like training input features
y_train array-like training target

predict(self, X_test)
Argument Type Description
X_test array-like testing input features

score(self, X_test, y_test)
Argument Type Description
X_test array-like testing input features
y_test array-like true values

export(self, filename)
Argument Type Description
filename str file name and path for model export
