
A toolset for distributed black-box hyperparameter optimisation.



Why Hypertunity

Hypertunity is a lightweight, high-level library for hyperparameter optimisation. Among other things, it supports:

  • Bayesian optimisation by wrapping GPyOpt,
  • external or internal objective function evaluation via a scheduler, which is also compatible with Slurm,
  • real-time visualisation of results in Tensorboard via the HParams plugin.

For the full set of features, refer to the documentation.

Quick start

Define the objective function to optimise. For example, it can take the hyperparameters params as input and return a raw score value as output:

import hypertunity as ht

def foo(**params) -> float:
    # do some very costly computations
    ...
    return score

To define the valid ranges for the values of params, we create a Domain object:

domain = ht.Domain({
    "x": [-10., 10.],         # continuous variable within the interval [-10., 10.]
    "y": {"opt1", "opt2"},    # categorical variable from the set {"opt1", "opt2"}
    "z": set(range(4))        # discrete variable from the set {0, 1, 2, 3}
})
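
For illustration, here is a toy implementation of foo that is consistent with this domain. The scoring formula below is made up for the example and is not part of the library; any function with this signature would do:

def foo(x: float, y: str, z: int) -> float:
    # toy cost: smallest at x=2, y="opt1", z=0
    return (x - 2.0) ** 2 + (0.0 if y == "opt1" else 1.0) + 0.1 * z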

Then we set up the optimiser:

bo = ht.BayesianOptimisation(domain=domain)

Next, we run the optimisation for 10 steps. Each evaluation result is used to update the optimiser so that subsequent domain samples are drawn in an informed way:

n_steps = 10
for i in range(n_steps):
    samples = bo.run_step(batch_size=2, minimise=True)      # suggest next samples
    evaluations = [foo(**s.as_dict()) for s in samples]     # evaluate foo
    bo.update(samples, evaluations)                         # update the optimiser
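
If you also want to keep track of the best configuration found so far, plain Python bookkeeping around the same loop is enough. This is only an illustrative sketch, not a library feature:

best_sample, best_score = None, float("inf")
for i in range(n_steps):
    samples = bo.run_step(batch_size=2, minimise=True)
    evaluations = [foo(**s.as_dict()) for s in samples]
    bo.update(samples, evaluations)
    for s, e in zip(samples, evaluations):
        if e < best_score:              # lower is better since we minimise
            best_sample, best_score = s, e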

Finally, we visualise the results in Tensorboard:

import hypertunity.reports.tensorboard as tb

results = tb.Tensorboard(domain=domain, metrics=["score"], logdir="path/to/logdir")
results.from_history(bo.history)    # log all samples and scores gathered by the optimiser
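
The logged results can then be inspected with the standard Tensorboard command line tool, pointing it at the same log directory and opening the HParams tab:

tensorboard --logdir path/to/logdir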

Even quicker start

A high-level wrapper class, Trial, allows for seamless parallel optimisation without having to bother with scheduling jobs, updating the optimiser or logging:

trial = ht.Trial(objective=foo,
                 domain=domain,
                 optimiser="bo",
                 reporter="tensorboard",
                 metrics=["score"])
trial.run(n_steps, batch_size=2, n_parallel=2)
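
In effect, Trial wires together the same components shown above: it draws samples from the optimiser, schedules the objective evaluations (up to n_parallel of them at a time), updates the optimiser with the results and forwards everything to the chosen reporter.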

Installation

Using PyPI

To install the base version run:

pip install hypertunity

To use the Tensorboard dashboard, build the docs, or run the test suite, you will need the following extras:

pip install hypertunity[tensorboard,docs,tests]
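
Any subset of the extras can be installed on its own, e.g. pip install hypertunity[tensorboard] for the dashboard support only.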

From source

Check out the latest master and install it locally:

git clone https://github.com/gdikov/hypertunity.git
cd hypertunity
pip install ./[tensorboard]
