
A toolset for distributed black-box hyperparameter optimisation.

Project description


Why Hypertunity

Hypertunity is a lightweight, high-level library for hyperparameter optimisation. Among other features, it supports:

  • Bayesian optimisation by wrapping GPyOpt,
  • external or internal objective function evaluation by a scheduler, also compatible with Slurm,
  • real-time visualisation of results in Tensorboard via the HParams plugin.

For the full set of features, refer to the documentation.

Quick start

Define the objective function to optimise. For example, it can take the hyperparameters params as input and return a raw score as output:

import hypertunity as ht

def foo(**params) -> float:
    # run some very costly computation that produces a score
    ...
    return score

To define the valid ranges for the values of params, we create a Domain object:

domain = ht.Domain({
    "x": [-10., 10.],         # continuous variable within the interval [-10., 10.]
    "y": {"opt1", "opt2"},    # categorical variable from the set {"opt1", "opt2"}
    "z": set(range(4))        # discrete variable from the set {0, 1, 2, 3}
})
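
For illustration only, a toy stand-in for foo that fits this domain could look as follows. The quadratic form and the offsets below are invented for the example and are not part of hypertunity:

def foo(**params) -> float:
    # Toy surrogate for the costly computation: a quadratic in the
    # continuous x, shifted depending on the categorical choice of y
    # and the discrete value of z.
    offset = 1.0 if params["y"] == "opt1" else -1.0
    return (params["x"] - offset) ** 2 + params["z"]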

Then we set up the optimiser:

bo = ht.BayesianOptimisation(domain=domain)

And we run the optimisation for 10 steps. Each result is used to update the optimiser, so that subsequent samples are drawn from increasingly promising regions of the domain:

n_steps = 10
for i in range(n_steps):
    samples = bo.run_step(batch_size=2, minimise=True)      # suggest next samples
    evaluations = [foo(**s.as_dict()) for s in samples]     # evaluate foo
    bo.update(samples, evaluations)                         # update the optimiser
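
The samples within a batch are independent, so their evaluation can also be parallelised. Below is a minimal sketch using the standard library; it assumes that foo and the samples are picklable, and the evaluate helper is defined here for the example rather than provided by hypertunity:

from concurrent.futures import ProcessPoolExecutor

def evaluate(sample):
    # Unpack a hypertunity sample into keyword arguments for foo.
    return foo(**sample.as_dict())

with ProcessPoolExecutor(max_workers=2) as pool:
    evaluations = list(pool.map(evaluate, samples))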

Finally, we visualise the results in Tensorboard:

import hypertunity.reports.tensorboard as tb

results = tb.Tensorboard(domain=domain, metrics=["score"], logdir="path/to/logdir")
results.from_history(bo.history)
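
The logged results can then be viewed by pointing Tensorboard at the same directory, assuming the tensorboard extra (see Installation below) is available:

tensorboard --logdir path/to/logdir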

Even quicker start

A high-level wrapper class, Trial, allows for seamless parallel optimisation without having to schedule jobs, update the optimiser or manage logging yourself:

trial = ht.Trial(objective=foo,
                 domain=domain,
                 optimiser="bo",
                 reporter="tensorboard",
                 metrics=["score"])
trial.run(n_steps, batch_size=2, n_parallel=2)

Installation

Using PyPI

To install the base version run:

pip install hypertunity

To use the Tensorboard dashboard, build the docs or run the test suite, you will need the following extras:

pip install hypertunity[tensorboard,docs,tests]
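Note that some shells, such as zsh, expand square brackets, so the argument may need quoting:

pip install "hypertunity[tensorboard,docs,tests]"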

From source

Check out the latest master and install locally:

git clone https://github.com/gdikov/hypertunity.git
cd hypertunity
pip install ./[tensorboard]
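
If you plan to modify the source, an editable install (a standard pip option, not specific to hypertunity) keeps the checkout importable without reinstalling:

pip install -e ".[tensorboard]"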

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

hypertunity-1.0.tar.gz (28.8 kB)


Built Distribution

hypertunity-1.0-py3-none-any.whl (39.2 kB)


File details

Details for the file hypertunity-1.0.tar.gz.

File metadata

  • Download URL: hypertunity-1.0.tar.gz
  • Upload date:
  • Size: 28.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.38.0 CPython/3.7.3

File hashes

Hashes for hypertunity-1.0.tar.gz

  • SHA256: 33ad6605b5a2fa25a4074c4506858cabed1160664507b96ef75e45766267a9ed
  • MD5: dd7dc565f8199c1e0ca6f4ffc68e85b9
  • BLAKE2b-256: 550bd5b241f4810f08b31d24fd0d9053184daec732be2aefa53849fba72a7e19


File details

Details for the file hypertunity-1.0-py3-none-any.whl.

File metadata

  • Download URL: hypertunity-1.0-py3-none-any.whl
  • Upload date:
  • Size: 39.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.38.0 CPython/3.7.3

File hashes

Hashes for hypertunity-1.0-py3-none-any.whl

  • SHA256: fb3c281cd014b5137503192770d590ac364678550d925c5f011e2ddbc457bd1f
  • MD5: 0aa67e6435ea04da7c0282717b9486ca
  • BLAKE2b-256: 108265021cfc31dd4b3203e70e4f1978dc60bedeec94a357528d94cc71934b9a

