Scalable asynchronous neural architecture and hyperparameter search for deep neural networks.
What is DeepHyper?
DeepHyper is an automated machine learning (AutoML) package for deep neural networks. It comprises two components:

1) Neural architecture search: automatically searching for high-performing deep neural network architectures.
2) Hyperparameter search: automatically searching for high-performing hyperparameters for a given deep neural network.

DeepHyper provides an infrastructure that targets experimental research in neural architecture and hyperparameter search methods, scalability, and portability across HPC systems. It comprises three modules: benchmarks, a collection of extensible and diverse benchmark problems; search, a set of search algorithms for neural architecture search and hyperparameter search; and evaluators, a common interface for evaluating hyperparameter configurations on HPC platforms.
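The bridge between these components and a user's model is an objective function that the search calls with a candidate configuration. A minimal pure-Python sketch of that contract is below; the function name `run`, the keys `x`/`y`, and the toy polynomial are illustrative, not the actual `polynome2` benchmark (see the DeepHyper documentation for the real problem definitions):

```python
# Illustrative sketch of a DeepHyper-style "run" function: the search
# repeatedly calls it with a hyperparameter configuration (a dict) and
# uses the returned scalar as the objective to maximize.
def run(config):
    # Hypothetical hyperparameters; real benchmarks define their own.
    x = config["x"]
    y = config["y"]
    # Toy objective: a negative polynomial, maximized at x = y = 0.
    return -(x ** 2 + y ** 2)

print(run({"x": 1.0, "y": 2.0}))  # -5.0
```

In a real problem, the body of `run` would build, train, and evaluate a neural network, returning a metric such as validation accuracy.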
DeepHyper documentation is on ReadTheDocs.
pip install deephyper
git clone https://github.com/deephyper/deephyper.git
cd deephyper/
pip install -e .
If you want to install DeepHyper with the test and documentation packages:
# From PyPI
pip install 'deephyper[tests,docs]'

# From GitHub
git clone https://github.com/deephyper/deephyper.git
cd deephyper/
pip install -e '.[tests,docs]'
- benchmark/: a set of problems for hyperparameter or neural architecture search that users can use to compare the different search algorithms, or as examples from which to build their own problems.
- evaluator/: a set of objects that help run searches on different systems and for different use cases, such as quick, light experiments or long, heavy runs.
- search/: a set of algorithms for hyperparameter and neural architecture search. You will also find a modular way to define new search algorithms, as well as specific submodules for hyperparameter or neural architecture search:
  - hps/: hyperparameter search applications
  - nas/: neural architecture search applications
How do I learn more?
GitHub repository: https://github.com/deephyper/deephyper
Hyperparameter Search (HPS)
deephyper hps ambs --evaluator ray --problem deephyper.benchmark.hps.polynome2.Problem --run deephyper.benchmark.hps.polynome2.run --n-jobs 1
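Conceptually, a search like the one launched above is a loop in which an evaluator repeatedly scores candidate configurations and the search keeps the best. The sketch below illustrates that loop with plain random search in pure Python; it is not DeepHyper's AMBS (asynchronous model-based search) implementation, and the names `run`, `random_search`, and the `lr` hyperparameter are made up for illustration:

```python
import random

def run(config):
    # Toy objective standing in for a model-training run:
    # highest (0.0) when the learning rate hits 0.01 exactly.
    return -(config["lr"] - 0.01) ** 2

def random_search(run, n_iters=100, seed=42):
    """Sample configurations at random and keep the highest objective."""
    rng = random.Random(seed)
    best_config, best_score = None, float("-inf")
    for _ in range(n_iters):
        config = {"lr": rng.uniform(0.0, 0.1)}  # sample from the search space
        score = run(config)                     # evaluate the candidate
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

best_config, best_score = random_search(run)
```

AMBS replaces the random sampling with a surrogate model that proposes promising configurations, and the ray evaluator runs the `run` calls asynchronously in parallel, but the overall propose/evaluate/update loop is the same.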
Neural Architecture Search (NAS)
deephyper nas ambs --evaluator ray --problem deephyper.benchmark.nas.polynome2Reg.Problem --n-jobs 1
Who is responsible?
Currently, the core DeepHyper team is at Argonne National Laboratory:
- Prasanna Balaprakash, Lead and founder
- Romain Egele
- Misha Salim
- Romit Maulik
- Venkat Vishwanath
- Stefan Wild
Modules, patches (code, documentation, etc.) contributed by:
If you are referencing DeepHyper in a publication, please cite the following papers:
- P. Balaprakash, M. Salim, T. Uram, V. Vishwanath, and S. M. Wild. DeepHyper: Asynchronous Hyperparameter Search for Deep Neural Networks. In 25th IEEE International Conference on High Performance Computing, Data, and Analytics. IEEE, 2018.
- P. Balaprakash, R. Egele, M. Salim, S. Wild, V. Vishwanath, F. Xia, T. Brettin, and R. Stevens. Scalable reinforcement-learning-based neural architecture search for cancer deep learning research. In SC ’19: IEEE/ACM International Conference on High Performance Computing, Networking, Storage and Analysis, 2019.
How can I participate?
Questions, comments, feature requests, bug reports, etc. can be directed to:
Patches to the software itself, as well as to the documentation, are much appreciated. In your first patch, please also add a credit for yourself to the contributor list above.
- Scalable Data-Efficient Learning for Scientific Domains, U.S. Department of Energy 2018 Early Career Award funded by the Advanced Scientific Computing Research program within the DOE Office of Science (2018--Present)
- Argonne Leadership Computing Facility: This research used resources of the Argonne Leadership Computing Facility, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.
- SLIK-D: Scalable Machine Learning Infrastructures for Knowledge Discovery, Argonne Computing, Environment and Life Sciences (CELS) Laboratory Directed Research and Development (LDRD) Program (2016--2018)
Copyright and license
Copyright © 2019, UChicago Argonne, LLC
DeepHyper is distributed under the terms of the BSD License. See the LICENSE file.
Argonne Patent & Intellectual Property File Number: SF-19-007