DEHB: Evolutionary Hyperband for Scalable, Robust and Efficient Hyperparameter Optimization


Welcome to DEHB, an algorithm for Hyperparameter Optimization (HPO). DEHB uses Differential Evolution (DE) under the hood as the evolutionary algorithm powering the black-box optimization that HPO problems pose.
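To give a feel for the DE machinery involved, here is a minimal sketch of one generation of classic DE (the standard rand/1/bin scheme) on plain Python lists. This illustrates generic Differential Evolution, not DEHB's actual internals; the toy sphere objective and all parameter values are illustrative.

```python
import random

def de_rand1_bin(population, F=0.5, CR=0.9):
    """One generation of classic DE (rand/1/bin), minimizing a toy objective."""
    def sphere(x):  # toy objective: sum of squares, minimum at the origin
        return sum(v * v for v in x)

    next_pop = []
    for i, target in enumerate(population):
        # Pick three distinct individuals other than the target
        a, b, c = random.sample([p for j, p in enumerate(population) if j != i], 3)
        # Mutation: v = a + F * (b - c)
        mutant = [av + F * (bv - cv) for av, bv, cv in zip(a, b, c)]
        # Binomial crossover: mix target and mutant, forcing at least one mutant gene
        j_rand = random.randrange(len(target))
        trial = [m if (random.random() < CR or j == j_rand) else t
                 for j, (t, m) in enumerate(zip(target, mutant))]
        # Greedy selection: keep whichever of target/trial is better
        next_pop.append(trial if sphere(trial) <= sphere(target) else target)
    return next_pop

pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(10)]
for _ in range(50):
    pop = de_rand1_bin(pop)
```

DEHB couples this kind of population update with Hyperband's fidelity scheduling, so cheap low-fidelity evaluations guide where expensive high-fidelity ones are spent.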

dehb is a Python package implementing the DEHB algorithm. It offers an intuitive interface for optimizing user-defined problems with DEHB.

Getting Started

Installation

```bash
pip install dehb
```

Using DEHB

DEHB allows users to either use the Ask & Tell interface for manual task distribution or leverage the built-in functionality (run) to set up a Dask cluster autonomously. The following snippet offers a small glimpse into how to use DEHB. For further information, please refer to the getting-started examples in our documentation.
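Before constructing the optimizer, you need a target function that evaluates one configuration at a given fidelity. The sketch below follows the return convention described in the dehb docs (a dict with a `"fitness"` value to minimize and the `"cost"` of the evaluation); the toy loss and the `"lr"` hyperparameter are hypothetical stand-ins, so check your dehb version for the exact contract.

```python
import time

def your_target_function(config, fidelity, **kwargs):
    """Sketch of a DEHB-style objective evaluated at a given fidelity."""
    start = time.time()
    # Hypothetical stand-in for training a model for `fidelity` epochs
    # with hyperparameters drawn from `config` (here: a learning rate).
    lr = config["lr"]
    validation_loss = (lr - 0.01) ** 2 + 1.0 / (1.0 + fidelity)
    return {
        "fitness": validation_loss,      # value DEHB minimizes
        "cost": time.time() - start,     # resource spent on this evaluation
        "info": {"fidelity": fidelity},  # optional extras
    }
```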

```python
from dehb import DEHB

optimizer = DEHB(
    f=your_target_function,
    cs=config_space,
    dimensions=dimensions,
    min_fidelity=min_fidelity,
    max_fidelity=max_fidelity,
)

##### Using Ask & Tell
# Ask for the next configuration to run
job_info = optimizer.ask()

# Run the configuration at the given fidelity. Here you can freely
# distribute the computation to any worker you'd like.
result = your_target_function(config=job_info["config"], fidelity=job_info["fidelity"])

# Once you have the result, feed it back to the optimizer
optimizer.tell(job_info, result)

##### Using run()
# Run optimization for 1 bracket. Output files will be saved to ./logs
traj, runtime, history = optimizer.run(brackets=1)
```
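A bracket here refers to one round of Hyperband's successive-halving schedule, which spaces fidelities geometrically between `min_fidelity` and `max_fidelity`. The sketch below is illustrative only; dehb computes its schedule internally, and `eta=3` is the conventional Hyperband default, not a claim about your configuration.

```python
def fidelity_schedule(min_fidelity, max_fidelity, eta=3):
    """Sketch of the geometric fidelity ladder used by Hyperband-style methods."""
    fidelities = [float(max_fidelity)]
    # Keep dividing by eta until we would drop below the minimum fidelity
    while fidelities[-1] / eta >= min_fidelity * (1 - 1e-9):
        fidelities.append(fidelities[-1] / eta)
    return sorted(fidelities)

print(fidelity_schedule(1, 27))  # → [1.0, 3.0, 9.0, 27.0]
```

Configurations start cheap at the low end of the ladder, and only the most promising ones are promoted to higher fidelities.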

Running DEHB in a parallel setting

For a more in-depth look at how to run DEHB in a parallel setting, please have a look at our documentation.
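The Ask & Tell interface makes manual distribution straightforward: ask for several jobs, evaluate them on whatever workers you control, and tell the results back. The sketch below shows that pattern with a stdlib thread pool and a hypothetical random-search stub standing in for the optimizer, so it stays self-contained; only the ask/tell shape mirrors dehb, everything else is illustrative.

```python
import random
from concurrent.futures import ThreadPoolExecutor, as_completed

class RandomSearchStub:
    """Stand-in exposing an ask/tell interface, mimicking the pattern DEHB uses."""
    def __init__(self):
        self.results = []

    def ask(self):
        return {"config": {"x": random.uniform(-1, 1)}, "fidelity": 1}

    def tell(self, job_info, result):
        self.results.append((job_info, result))

def evaluate(config, fidelity):
    # Toy objective; a real worker would train/evaluate a model here
    return {"fitness": config["x"] ** 2, "cost": fidelity}

opt = RandomSearchStub()
with ThreadPoolExecutor(max_workers=4) as pool:
    # Ask for 8 jobs up front and fan them out to the pool
    futures = {pool.submit(evaluate, j["config"], j["fidelity"]): j
               for j in (opt.ask() for _ in range(8))}
    # Feed each result back as soon as its worker finishes
    for fut in as_completed(futures):
        opt.tell(futures[fut], fut.result())
```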

Tutorials/Example notebooks

To run the PyTorch example (note the additional requirements):

```bash
python examples/03_pytorch_mnist_hpo.py \
    --min_fidelity 1 \
    --max_fidelity 3 \
    --runtime 60 \
    --verbose
```

Documentation

For more details and features, please have a look at our documentation.

Contributing

Any contribution is greatly appreciated! Please take the time to check out our contributing guidelines.


To cite the paper or code

@inproceedings{awad-ijcai21,
  author    = {N. Awad and N. Mallik and F. Hutter},
  title     = {{DEHB}: Evolutionary Hyperband for Scalable, Robust and Efficient Hyperparameter Optimization},
  pages     = {2147--2153},
  booktitle = {Proceedings of the Thirtieth International Joint Conference on
               Artificial Intelligence, {IJCAI-21}},
  publisher = {ijcai.org},
  editor    = {Z. Zhou},
  year      = {2021}
}

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

dehb-0.1.2.tar.gz (41.0 kB)


Built Distribution

DEHB-0.1.2-py3-none-any.whl (37.6 kB)


File details

Details for the file dehb-0.1.2.tar.gz.

File metadata

  • Download URL: dehb-0.1.2.tar.gz
  • Upload date:
  • Size: 41.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for dehb-0.1.2.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 7250f6f27e73b08af96f4f8a099b44d544b3f5a5dd0a37b044f6de43c1545188 |
| MD5 | 2b3d00f0b561c13659b24ff2eadccacc |
| BLAKE2b-256 | 3a0b9f2b7b10c7ac1afa8747c1a3d03d31215a530e63876effc0f2960fcab466 |

See more details on using hashes here.

File details

Details for the file DEHB-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: DEHB-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 37.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for DEHB-0.1.2-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | f109b18a051e93a2ce3f71be1b63c66b8f8baa9ed816fdb10fa0387e04e58d74 |
| MD5 | 86f56fd57af67e8b887c88ece163a9bf |
| BLAKE2b-256 | 4cc5e1b242722d51481b13212165dbf50ce71b877a01aa8498e44515561d7766 |

See more details on using hashes here.
