
Scalable asynchronous neural architecture and hyperparameter search for deep neural networks.

Project description


What is DeepHyper?

DeepHyper is an automated machine learning (AutoML) package for deep neural networks. It comprises two components: 1) neural architecture search, an approach for automatically finding high-performing deep neural network architectures, and 2) hyperparameter search, an approach for automatically finding high-performing hyperparameters for a given deep neural network. DeepHyper provides an infrastructure that targets experimental research in neural architecture and hyperparameter search methods, scalability, and portability across HPC systems. It comprises three modules: benchmarks, a collection of extensible and diverse benchmark problems; search, a set of search algorithms for neural architecture search and hyperparameter search; and evaluators, a common interface for evaluating hyperparameter configurations on HPC platforms.

Documentation

DeepHyper documentation is hosted on ReadTheDocs

Install instructions

From pip:

pip install deephyper

From GitHub:

git clone https://github.com/deephyper/deephyper.git
cd deephyper/
pip install -e .

If you want to install DeepHyper with the test and documentation dependencies:

# From PyPI
pip install 'deephyper[tests,docs]'

# From GitHub
git clone https://github.com/deephyper/deephyper.git
cd deephyper/
pip install -e '.[tests,docs]'
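
Either way, a quick import check confirms the installation (this assumes the package exposes a __version__ string, as most packages do):

python -c "import deephyper; print(deephyper.__version__)"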

Directory structure

benchmark/
    A set of problems for hyperparameter or neural architecture search that users can employ to compare the different search algorithms, or use as examples for building their own problems.
evaluator/
    A set of objects that help run searches on different systems and for different use cases, from quick, lightweight experiments to long, heavy runs.
search/
    A set of algorithms for hyperparameter and neural architecture search, together with a modular way to define new search algorithms. It contains dedicated submodules:
    hps/
        Hyperparameter search applications.
    nas/
        Neural architecture search applications.

How do I learn more?

Quickstart

Hyperparameter Search (HPS)

python -m deephyper.search.hps.ambs --problem deephyper.benchmark.hps.polynome2.Problem --run deephyper.benchmark.hps.polynome2.run
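
Here, --problem points to an importable problem definition and --run to a callable that evaluates one hyperparameter configuration and returns its objective. As a rough sketch of what a user-defined pair could look like (the module name my_problem.py, the hyperparameter names, and the HpProblem.add_dim signature are assumptions modeled on the bundled polynome2 benchmark; consult its source for the exact API in this release):

# my_problem.py -- hypothetical sketch, not the bundled benchmark
from deephyper.benchmark import HpProblem  # assumed import path

Problem = HpProblem()
Problem.add_dim('units', (8, 128))   # integer search range (assumed signature)
Problem.add_dim('lr', (0.001, 0.1))  # continuous search range

def run(config):
    # Train and evaluate a model with config['units'] and config['lr'],
    # then return a scalar objective for the search to maximize.
    units, lr = config['units'], config['lr']
    return -(units - 64) ** 2 - lr  # placeholder objective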

Neural Architecture Search (NAS)

python -m deephyper.search.nas.ppo_a3c_sync --problem deephyper.benchmark.nas.mnist1D.problem.Problem --run deephyper.search.nas.model.run.alpha.run
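
Both entry points follow the same pattern: --problem names an importable Problem object and --run names an importable evaluation callable. To search over your own problem, point the flags at your own module; mypackage.my_problem below is a hypothetical placeholder:

python -m deephyper.search.hps.ambs --problem mypackage.my_problem.Problem --run mypackage.my_problem.run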

Who is responsible?

Currently, the core DeepHyper team is at Argonne National Laboratory:

Modules, patches (code, documentation, etc.) contributed by:

Citing DeepHyper

If you are referencing DeepHyper in a publication, please cite the following paper:

  • P. Balaprakash, M. Salim, T. Uram, V. Vishwanath, and S. M. Wild. DeepHyper: Asynchronous Hyperparameter Search for Deep Neural Networks. In 25th IEEE International Conference on High Performance Computing, Data, and Analytics. IEEE, 2018.
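For convenience, an equivalent BibTeX entry (fields transcribed from the reference above; the citation key is an arbitrary choice):

@inproceedings{balaprakash2018deephyper,
  author    = {Balaprakash, P. and Salim, M. and Uram, T. and Vishwanath, V. and Wild, S. M.},
  title     = {DeepHyper: Asynchronous Hyperparameter Search for Deep Neural Networks},
  booktitle = {25th IEEE International Conference on High Performance Computing, Data, and Analytics},
  publisher = {IEEE},
  year      = {2018}
}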

How can I participate?

Questions, comments, feature requests, bug reports, etc. can be directed to:

Patches to the software itself as well as to the documentation are much appreciated. Optionally, include in your first patch a credit for yourself in the list above.

The DeepHyper team uses git-flow to organize development (see the Git-Flow cheatsheet). Tests are written with Pytest.

Acknowledgements

  • Scalable Data-Efficient Learning for Scientific Domains, U.S. Department of Energy 2018 Early Career Award funded by the Advanced Scientific Computing Research program within the DOE Office of Science (2018–Present)
  • Argonne Leadership Computing Facility (2018–Present)
  • SLIK-D: Scalable Machine Learning Infrastructures for Knowledge Discovery, Argonne Computing, Environment and Life Sciences (CELS) Laboratory Directed Research and Development (LDRD) Program (2016–2018)

Copyright and license

Copyright © 2019, UChicago Argonne, LLC

DeepHyper is distributed under the terms of the BSD License. See LICENSE.

Argonne Patent & Intellectual Property File Number: SF-19-007

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

deephyper-0.1.0.tar.gz (242.2 kB)


Built Distribution


deephyper-0.1.0-py2.py3-none-any.whl (356.2 kB)


File details

Details for the file deephyper-0.1.0.tar.gz.

File metadata

  • Download URL: deephyper-0.1.0.tar.gz
  • Upload date:
  • Size: 242.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.0.1 requests-toolbelt/0.9.1 tqdm/4.32.2 CPython/3.6.8

File hashes

Hashes for deephyper-0.1.0.tar.gz

  • SHA256: 156b8724a589eb89d05e6559f76aa7b4174eeca1492aec2b6af82dcd84fb70be
  • MD5: b599da7a5a5f08f8eff382c0e300b2b0
  • BLAKE2b-256: f7f619a42f2d812518374ed034a4b9ef8f4971574bf2489b6ea82bfe907bc29b


File details

Details for the file deephyper-0.1.0-py2.py3-none-any.whl.

File metadata

  • Download URL: deephyper-0.1.0-py2.py3-none-any.whl
  • Upload date:
  • Size: 356.2 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.0.1 requests-toolbelt/0.9.1 tqdm/4.32.2 CPython/3.6.8

File hashes

Hashes for deephyper-0.1.0-py2.py3-none-any.whl

  • SHA256: cc51f23e70d599b29614a486f9e69da6044a987a5ef1d2b73647430d02fd3d93
  • MD5: 1437d1da279f90e55e4d3d6d5df46c5b
  • BLAKE2b-256: bd6273834db8b1e873547aa420dbb3540bc6463e32545b586fd7534ff0391ccf

