DEHB: Evolutionary Hyperband for Scalable, Robust and Efficient Hyperparameter Optimization
Welcome to DEHB, an algorithm for Hyperparameter Optimization (HPO). DEHB uses Differential Evolution (DE) under the hood as the evolutionary algorithm powering the black-box optimization that HPO problems pose.
dehb is a Python package implementing the DEHB algorithm. It offers an intuitive interface for optimizing user-defined problems with DEHB.
Getting Started
Installation
pip install dehb
Using DEHB
DEHB allows users to either utilize the Ask & Tell interface for manual task distribution or leverage the built-in functionality (run()) to set up a Dask cluster autonomously. The following snippet offers a brief look into how to use DEHB. For further information, please refer to the getting-started examples in our documentation.
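The snippet assumes a search space and a target function already exist. Below is a minimal, hedged sketch of what those could look like; the toy objective and all names are illustrative assumptions, while the return dictionary with "fitness" and "cost" entries follows the format described in the documentation.

import ConfigSpace as CS

# Toy one-dimensional search space (illustrative only)
config_space = CS.ConfigurationSpace()
config_space.add_hyperparameter(
    CS.UniformFloatHyperparameter("x", lower=-5.0, upper=5.0))
dimensions = len(config_space.get_hyperparameters())
min_fidelity, max_fidelity = 1, 27

def your_target_function(config, fidelity, **kwargs):
    # Evaluate `config` at the given fidelity and report the loss to be
    # minimized ("fitness") plus the cost of the evaluation ("cost").
    loss = (config["x"] - 2.0) ** 2
    return {"fitness": loss, "cost": fidelity}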
from dehb import DEHB

optimizer = DEHB(
    f=your_target_function,
    cs=config_space,
    dimensions=dimensions,
    min_fidelity=min_fidelity,
    max_fidelity=max_fidelity)
##### Using Ask & Tell
# Ask for next configuration to run
job_info = optimizer.ask()
# Run the configuration for the given fidelity. Here you can freely distribute the computation to any worker you'd like.
result = your_target_function(config=job_info["config"], fidelity=job_info["fidelity"])
# Once you have received the result, feed it back to the optimizer
optimizer.tell(job_info, result)
##### Using run()
# Run optimization for 1 bracket. Output files will be saved to ./logs
traj, runtime, history = optimizer.run(brackets=1)
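The three returned objects summarize the optimization: traj tracks the incumbent fitness over time, runtime records the cost per evaluation, and history holds the full evaluation record. Below is a small hedged sketch of inspecting them; the exact contents may vary by version.

# Best (incumbent) fitness seen so far and number of evaluations performed
print(f"Final incumbent fitness: {traj[-1]}")
print(f"Evaluations performed: {len(history)}")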
Running DEHB in a parallel setting
For a more in-depth look at how to run DEHB in a parallel setting, please have a look at our documentation.
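As a rough orientation before consulting the docs, the sketch below shows the general shape of a parallel setup, where DEHB manages its own Dask workers given a worker count at construction time. Treat n_workers as an assumption here and check the documentation for the exact interface of your installed version.

# Hedged sketch: let DEHB spin up a local Dask cluster with 4 workers
optimizer = DEHB(
    f=your_target_function,
    cs=config_space,
    dimensions=dimensions,
    min_fidelity=min_fidelity,
    max_fidelity=max_fidelity,
    n_workers=4)  # assumed parameter; see the parallel-execution docs
traj, runtime, history = optimizer.run(brackets=1)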
Tutorials/Example notebooks
- 00 - A generic template to use DEHB for multi-fidelity Hyperparameter Optimization
- 01.1 - Using DEHB to optimize 4 hyperparameters of Scikit-learn's Random Forest on a classification dataset (a hedged sketch of such an objective follows this list)
- 01.2 - Using DEHB to optimize 4 hyperparameters of Scikit-learn's Random Forest on a classification dataset using the Ask & Tell interface
- 02 - Optimizing Scikit-learn's Random Forest without using ConfigSpace to represent the hyperparameter space
- 03 - Hyperparameter Optimization for MNIST in PyTorch
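For orientation, here is a hedged sketch of what the Random Forest objective in examples 01.1/01.2 could look like, interpreting the fidelity as the number of trees. The hyperparameter names, dataset, and fidelity mapping are illustrative assumptions, not the notebooks' exact code.

from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_digits(return_X_y=True)

def rf_target_function(config, fidelity, **kwargs):
    # Use the fidelity as the number of trees (illustrative choice)
    model = RandomForestClassifier(
        n_estimators=int(fidelity),
        max_depth=config["max_depth"],
        min_samples_split=config["min_samples_split"],
        random_state=0)
    accuracy = cross_val_score(model, X, y, cv=3).mean()
    # DEHB minimizes the fitness, so report 1 - accuracy
    return {"fitness": 1.0 - accuracy, "cost": fidelity}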
To run the PyTorch example (note the additional requirements):
python examples/03_pytorch_mnist_hpo.py \
--min_fidelity 1 \
--max_fidelity 3 \
--runtime 60 \
--verbose
Documentation
For more details and features, please have a look at our documentation.
Contributing
Any contribution is greatly appreciated! Please take the time to check out our contributing guidelines.
To cite the paper or code
@inproceedings{awad-ijcai21,
author = {N. Awad and N. Mallik and F. Hutter},
title = {{DEHB}: Evolutionary Hyperband for Scalable, Robust and Efficient Hyperparameter Optimization},
pages = {2147--2153},
booktitle = {Proceedings of the Thirtieth International Joint Conference on
Artificial Intelligence, {IJCAI-21}},
publisher = {ijcai.org},
editor = {Z. Zhou},
year = {2021}
}