Hyperoptim
Hyperparameter Optimization Using Genetic Algorithms
Installation
OS X, Windows & Linux:
pip install hyperoptim
Usage example
Use it to find the best hyperparameters:
from hyperoptim import GASearch, Hparams
import tensorflow as tf
from tensorflow import keras
(img_train, label_train), (img_test, label_test) = keras.datasets.fashion_mnist.load_data()
# define hyperparameter space
ht = Hparams()
hp_units = ht.Int('units', min_value=32, max_value=512, step=32)
hp_learning_rate = ht.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])
hp_activation = ht.Choice('activation', values=['relu', 'sigmoid', 'tanh'])
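To see what this search space amounts to, here is a minimal stand-alone sketch of how a genetic-algorithm tuner might sample one candidate ("individual") from it. The plain Python lists below are hypothetical stand-ins for the `Hparams` dimensions defined above, not part of the hyperoptim API:

```python
import random

# Hypothetical stand-ins for the Hparams dimensions defined above.
units_space = list(range(32, 513, 32))          # Int('units', 32..512, step=32)
lr_space = [1e-2, 1e-3, 1e-4]                   # Choice('learning_rate', ...)
activation_space = ['relu', 'sigmoid', 'tanh']  # Choice('activation', ...)

def random_individual():
    """Sample one candidate hyperparameter set (one GA 'individual')."""
    return [random.choice(units_space),
            random.choice(lr_space),
            random.choice(activation_space)]

individual = random_individual()
# a 3-element list [units, learning_rate, activation], the same positional
# layout that model_builder(params) indexes below
```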
# create the list of hyperparameters
params = [hp_units, hp_learning_rate, hp_activation]
# define the model-builder function
def model_builder(params):
    model = keras.Sequential()
    model.add(keras.layers.Flatten(input_shape=(28, 28)))
    # params[0] refers to hp_units, params[2] to hp_activation
    model.add(keras.layers.Dense(units=params[0], activation=params[2]))
    model.add(keras.layers.Dense(10))
    # params[1] refers to hp_learning_rate
    model.compile(optimizer=keras.optimizers.Adam(learning_rate=params[1]),
                  loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
                  metrics=['accuracy'])
    return model
# initialize the GASearch
tuner = GASearch(model_builder=model_builder, params=params,
                 objective='val_accuracy', weights=(1.0,), max_epochs=10,
                 directory='my_dir', project_name='intro_to_kt')
# run the search
tuner.search(img_train, label_train, epochs=2, validation_split=0.2)
# Get the optimal hyperparameters
best_hps = tuner.get_best_hyperparameters()[0]
# Build the model with the optimal hyperparameters and retrain it on the data
model = tuner.build(best_hps)
history = model.fit(img_train, label_train, epochs=2, validation_split=0.2)
val_acc_per_epoch = history.history['val_accuracy']
best_epoch = val_acc_per_epoch.index(max(val_acc_per_epoch)) + 1
print('Best epoch: %d' % (best_epoch,))
eval_result = model.evaluate(img_test, label_test)
print("[test loss, test accuracy]:", eval_result)
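Under the hood, a genetic search like the one `GASearch` performs evolves a population of candidate hyperparameter sets through selection, crossover, and mutation. The sketch below illustrates that loop in plain Python; the fitness function here is a toy stand-in (the real tuner would use `val_accuracy` from training runs), and all names and rates are illustrative assumptions, not hyperoptim internals:

```python
import random

# Toy search space mirroring the Hparams defined earlier.
units_space = list(range(32, 513, 32))
lr_space = [1e-2, 1e-3, 1e-4]
act_space = ['relu', 'sigmoid', 'tanh']
SPACE = [units_space, lr_space, act_space]

def toy_fitness(ind):
    # Stand-in for validation accuracy: prefers 256 units and lr = 1e-3.
    units, lr, _ = ind
    return -abs(units - 256) - 1000 * abs(lr - 1e-3)

def evolve(pop_size=20, generations=10, mutation_rate=0.2, seed=0):
    rng = random.Random(seed)
    # Start from a random population of candidate hyperparameter sets.
    pop = [[rng.choice(dim) for dim in SPACE] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=toy_fitness, reverse=True)
        parents = pop[:pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(SPACE))   # one-point crossover
            child = a[:cut] + b[cut:]
            for i, dim in enumerate(SPACE):      # per-gene mutation
                if rng.random() < mutation_rate:
                    child[i] = rng.choice(dim)
            children.append(child)
        pop = parents + children                 # elitist replacement
    return max(pop, key=toy_fitness)

best = evolve()
# best is a [units, learning_rate, activation] triple drawn from the space
```

The real search replaces `toy_fitness` with an expensive model-training run, which is why the population and generation counts matter so much for total cost.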
Development setup
For a local development setup:
git clone https://github.com/deepak7376/hyperoptim
cd hyperoptim
pip install -r requirements.txt
Meta
Deepak Yadav
Distributed under the MIT license. See LICENSE for more information.
https://github.com/deepak7376/hypertune/blob/master/LICENSE