A toolset for distributed black-box hyperparameter optimisation.
Why Hypertunity
Hypertunity is a lightweight, high-level library for hyperparameter optimisation. Among other features, it supports:
- Bayesian optimisation by wrapping GPyOpt,
- external or internal objective function evaluation by a scheduler, also compatible with Slurm,
- real-time visualisation of results in TensorBoard via the HParams plugin.
For the full set of features refer to the documentation.
Quick start
Define the objective function to optimise. For example, it can take the hyperparameters params as input and return a raw value score as output:
import hypertunity as ht

def foo(**params) -> float:
    # do some very costly computations
    ...
    return score
To define the valid ranges for the values of params, we create a Domain object:
domain = ht.Domain({
    "x": [-10., 10.],       # continuous variable within the interval [-10., 10.]
    "y": {"opt1", "opt2"},  # categorical variable from the set {"opt1", "opt2"}
    "z": set(range(4))      # discrete variable from the set {0, 1, 2, 3}
})
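As a concrete illustration (not part of the library), a toy objective matching this domain's keys could look like the following. The function name foo and its formula are made up for the example; any function accepting x, y and z as keyword arguments would do:

```python
def foo(x: float, y: str, z: int) -> float:
    """Toy objective over the example domain: a shifted paraboloid in x,
    offset by the categorical choice y and the discrete variable z."""
    offset = 1.0 if y == "opt1" else 2.0
    return (x - 3.0) ** 2 + offset + 0.5 * z

# The minimum over the domain is at x=3.0, y="opt1", z=0, with value 1.0.
```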
Then we set up the optimiser:
bo = ht.BayesianOptimisation(domain=domain)
And we run the optimisation for 10 steps. Each result is used to update the optimiser so that informed domain samples are drawn:
n_steps = 10
for i in range(n_steps):
    samples = bo.run_step(batch_size=2, minimise=True)   # suggest next samples
    evaluations = [foo(**s.as_dict()) for s in samples]  # evaluate foo
    bo.update(samples, evaluations)                      # update the optimiser
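To make the sample–evaluate–update mechanics of this loop concrete, here is a plain-Python sketch that replaces Bayesian optimisation with random search over the same three variable types. It uses only the standard library and keeps the best evaluation seen so far; it is an illustration of the loop structure, not of hypertunity's API:

```python
import random

# Mirror of the example domain, using plain tuples instead of ht.Domain.
domain = {
    "x": (-10.0, 10.0),     # continuous interval
    "y": ("opt1", "opt2"),  # categorical options
    "z": tuple(range(4)),   # discrete options
}

def sample(rng: random.Random) -> dict:
    """Draw one configuration: uniform on the interval, uniform over the sets."""
    return {
        "x": rng.uniform(*domain["x"]),
        "y": rng.choice(domain["y"]),
        "z": rng.choice(domain["z"]),
    }

def random_search(objective, n_steps: int, batch_size: int, seed: int = 0):
    """Minimise objective by evaluating random batches, tracking the best."""
    rng = random.Random(seed)
    best_params, best_score = None, float("inf")
    for _ in range(n_steps):
        for params in (sample(rng) for _ in range(batch_size)):
            score = objective(**params)
            if score < best_score:
                best_params, best_score = params, score
    return best_params, best_score
```

A Bayesian optimiser differs only in the sampling step: instead of drawing blindly, it fits a surrogate model to the evaluations collected so far and proposes the next batch from it.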
Finally, we visualise the results in TensorBoard:
import hypertunity.reports.tensorboard as tb
results = tb.Tensorboard(domain=domain, metrics=["score"], logdir="path/to/logdir")
results.from_history(bo.history)
Even quicker start
A high-level wrapper class Trial allows for seamless parallel optimisation without bothering with scheduling jobs, updating optimisers and logging:
trial = ht.Trial(objective=foo,
                 domain=domain,
                 optimiser="bo",
                 reporter="tensorboard",
                 metrics=["score"])
trial.run(n_steps, batch_size=2, n_parallel=2)
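The n_parallel argument evaluates each batch concurrently. The idea can be sketched with the standard library alone; the helper name evaluate_batch below is invented for the example and is not part of hypertunity:

```python
from concurrent.futures import ThreadPoolExecutor

def evaluate_batch(objective, batch, n_parallel=2):
    """Evaluate a batch of parameter dicts concurrently, preserving order."""
    with ThreadPoolExecutor(max_workers=n_parallel) as pool:
        return list(pool.map(lambda params: objective(**params), batch))
```

Threads suit objectives that release the GIL (e.g. subprocess-based training jobs); CPU-bound Python objectives would use a process pool instead.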
Installation
Using PyPI
To install the base version run:
pip install hypertunity
To use the TensorBoard dashboard, build the docs or run the test suite, you will need the following extras:
pip install hypertunity[tensorboard,docs,tests]
From source
Check out the latest master and install locally:
git clone https://github.com/gdikov/hypertunity.git
cd hypertunity
pip install "./[tensorboard]"