
Hyperparameter Optimization Using Genetic Algorithms

Project description



OS X, Windows & Linux:

pip install hyperoptim

Usage example

Use it to find the best hyperparameters:

from hyperoptim import GASearch, Hparams
import tensorflow as tf
from tensorflow import keras

(img_train, label_train), (img_test, label_test) = keras.datasets.fashion_mnist.load_data()

# define hyperparameter space
ht = Hparams()
hp_units = ht.Int('units', min_value=32, max_value=512, step=32)
hp_learning_rate = ht.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])
hp_activation = ht.Choice('activation', values=['relu', 'sigmoid', 'tanh'])

# create the list of hyperparameters
params = [hp_units, hp_learning_rate, hp_activation]

# define the model
def model_builder(params):
    model = keras.Sequential()
    model.add(keras.layers.Flatten(input_shape=(28, 28)))
    # here params[0] refers to hp_units and params[2] refers to hp_activation
    model.add(keras.layers.Dense(units=params[0], activation=params[2]))
    return model

# initialize the GASearch
tuner = GASearch(model_builder=model_builder, params=params,
                 objective='val_accuracy', weights=(1.0,), max_epochs=10,
                 directory='my_dir', project_name='intro_to_kt')

# run the search
# NOTE: the original snippet was truncated here; the method names below
# (search, get_best_hyperparameters, hypermodel.build) are assumed from the
# Keras Tuner-style API this example mirrors
tuner.search(img_train, label_train, epochs=2, validation_split=0.2)

# Get the optimal hyperparameters
best_hps = tuner.get_best_hyperparameters()

# Build the model with the optimal hyperparameters and train it on the data
model = tuner.hypermodel.build(best_hps)
history = model.fit(img_train, label_train, epochs=2, validation_split=0.2)

val_acc_per_epoch = history.history['val_accuracy']
best_epoch = val_acc_per_epoch.index(max(val_acc_per_epoch)) + 1
print('Best epoch: %d' % (best_epoch,))

eval_result = model.evaluate(img_test, label_test)
print("[test loss, test accuracy]:", eval_result)
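To make the genetic-algorithm idea behind the tuner concrete, here is a minimal, self-contained sketch of a GA loop over the same three hyperparameters. This is an illustration of the general technique, not hyperoptim's actual internals; the toy fitness function stands in for validation accuracy, and all names are illustrative:

```python
import random

random.seed(0)

# Search space mirroring the example above: (name, possible values)
SPACE = [
    ("units", list(range(32, 513, 32))),
    ("learning_rate", [1e-2, 1e-3, 1e-4]),
    ("activation", ["relu", "sigmoid", "tanh"]),
]

def random_individual():
    # One candidate = one value per hyperparameter
    return [random.choice(values) for _, values in SPACE]

def fitness(ind):
    # Toy stand-in for validation accuracy: rewards more units,
    # learning_rate == 1e-3, and relu activation
    units, lr, act = ind
    score = units / 512
    score += 1.0 if lr == 1e-3 else 0.0
    score += 0.5 if act == "relu" else 0.0
    return score

def crossover(a, b):
    # Uniform crossover: each gene comes from either parent
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(ind, rate=0.1):
    # With probability `rate`, resample a gene from its value set
    return [random.choice(values) if random.random() < rate else gene
            for gene, (_, values) in zip(ind, SPACE)]

def ga_search(pop_size=20, generations=30):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection: keep the top half
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = ga_search()
print(best)
```

In a real tuner, evaluating `fitness` means training the model built from the candidate's hyperparameters and reading off its validation metric, which is why each generation is expensive and population sizes are kept small.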

Development setup

For local development setup:

git clone
cd hyperoptim
pip install -r requirements.txt


Deepak Yadav

Distributed under the MIT license. See LICENSE for more information.



Project details

Download files

Download the file for your platform.

Source Distribution

hyperoptim-0.0.1.tar.gz (4.4 kB)
