
Machine Learning Experiment Hyperparameter Optimization


Lightweight Hyperparameter Optimization 🚀


Simple and intuitive hyperparameter optimization API for your Machine Learning Experiments (MLE). This includes simple grid and random search as well as sequential model-based optimization (SMBO) and a set of more unorthodox search algorithms (multi-objective optimization via nevergrad and a coordinate-wise search). Portable hyperparameter spaces are available for real-, integer- and categorical-valued variables. The search strategies assume that the underlying objective is minimized (multiply by -1 if this is not the case). For a quickstart, check out the notebook blog.

The API 🎮

from mle_hyperopt import RandomSearch

# Instantiate random search class
strategy = RandomSearch(real={"lrate": {"begin": 0.1,
                                        "end": 0.5,
                                        "prior": "log-uniform"}},
                        integer={"batch_size": {"begin": 32,
                                                "end": 128,
                                                "prior": "uniform"}},
                        categorical={"arch": ["mlp", "cnn"]})

# Simple ask - eval - tell API
configs = strategy.ask(5)
values = [train_network(**c) for c in configs]
strategy.tell(configs, values)
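
Conceptually, each random-search iteration just draws every variable from its prior. Below is a minimal standalone sketch of the ask - eval - tell loop above (illustrative only, not the library's internals), with a toy objective standing in for `train_network`:

```python
import math
import random

random.seed(0)

def sample_config():
    """Draw one configuration: log-uniform real, uniform integer, categorical."""
    log_lo, log_hi = math.log(0.1), math.log(0.5)
    return {
        "lrate": math.exp(random.uniform(log_lo, log_hi)),  # log-uniform in [0.1, 0.5]
        "batch_size": random.randint(32, 128),              # uniform integer
        "arch": random.choice(["mlp", "cnn"]),              # categorical
    }

def toy_objective(lrate, batch_size, arch):
    """Stand-in for train_network: lower is better."""
    return (lrate - 0.3) ** 2 + abs(batch_size - 64) / 1000

configs = [sample_config() for _ in range(5)]      # ask
values = [toy_objective(**c) for c in configs]     # eval
best_value = min(values)                           # tell: track the minimum
```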

Implemented Search Types 🔭

Search Type       Description                                  search_config
GridSearch        Search over a list of discrete values        -
RandomSearch      Random search over variable ranges           refine_after, refine_top_k
SMBOSearch        Sequential model-based optimization          base_estimator, acq_function, n_initial_points
CoordinateSearch  Coordinate-wise optimization with defaults   order, defaults
NevergradSearch   Multi-objective nevergrad wrapper            optimizer, budget_size, num_workers
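
For intuition, grid search simply enumerates the Cartesian product of the discrete values of all variables. A two-variable sketch (illustrative, not the library's code):

```python
from itertools import product

# Discrete candidate values for two variables
x_vals = [0.0, 0.25, 0.5]
archs = ["mlp", "cnn"]

# Every combination becomes one configuration: 3 * 2 = 6 in total
grid = [{"x": x, "arch": a} for x, a in product(x_vals, archs)]
```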

Variable Types & Hyperparameter Spaces 🌍

Variable      Type             Space Specification
real          Real-valued      Dict: begin, end, prior/bins (grid)
integer       Integer-valued   Dict: begin, end, prior/bins (grid)
categorical   Categorical      List: values to search over
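
For grid strategies, real and integer variables take `bins` instead of a `prior`: the `[begin, end]` range is discretized into evenly spaced points. A sketch of that discretization (illustrative, not the library's code):

```python
def discretize(begin, end, bins):
    """Evenly spaced grid points over [begin, end], endpoints included."""
    step = (end - begin) / (bins - 1)
    return [begin + i * step for i in range(bins)]

points = discretize(0.0, 0.5, 5)  # 5 bins over [0.0, 0.5]
```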

Installation ⏳

A PyPI installation is available via:

pip install mle-hyperopt

Alternatively, you can clone this repository and install it manually:

git clone https://github.com/RobertTLange/mle-hyperopt.git
cd mle-hyperopt
pip install -e .

Further Options 🚴

Saving & Reloading Logs 🏪

# Storing & reloading of results from .json
strategy.save("search_log.json")
strategy = RandomSearch(..., reload_path="search_log.json")

# Or manually add info after class instantiation
strategy = RandomSearch(...)
strategy.load("search_log.json")

Search Decorator 🧶

from mle_hyperopt import hyperopt

@hyperopt(strategy_type="grid",
          num_search_iters=25,
          real={"x": {"begin": 0., "end": 0.5, "bins": 5},
                "y": {"begin": 0, "end": 0.5, "bins": 5}})
def circle(config):
    # Squared distance from the origin - minimized by the grid search
    distance = config["x"] ** 2 + config["y"] ** 2
    return distance

strategy = circle()
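
The decorator pattern above can be mimicked in a few lines: wrap the objective so that calling it runs a fixed budget of evaluations and returns the best result. A toy analogue (hypothetical `toy_hyperopt`, not the library's implementation, and random rather than grid sampling):

```python
import random
from functools import wraps

def toy_hyperopt(num_search_iters, space):
    """Toy analogue of @hyperopt: evaluate the objective on sampled configs."""
    def decorator(fn):
        @wraps(fn)
        def run():
            rng = random.Random(0)
            results = []
            for _ in range(num_search_iters):
                config = {k: rng.uniform(v["begin"], v["end"])
                          for k, v in space.items()}
                results.append((fn(config), config))
            # Return the (value, config) pair with the lowest objective
            return min(results, key=lambda r: r[0])
        return run
    return decorator

@toy_hyperopt(num_search_iters=25, space={"x": {"begin": 0.0, "end": 0.5},
                                          "y": {"begin": 0.0, "end": 0.5}})
def circle(config):
    return config["x"] ** 2 + config["y"] ** 2

best_value, best_config = circle()
```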

Storing Configuration Files 📑

# Store 2 proposed configurations - eval_0.yaml, eval_1.yaml
strategy.ask(2, store=True)
# Store with explicit configuration filenames - conf_0.yaml, conf_1.yaml
strategy.ask(2, store=True, config_fnames=["conf_0.yaml", "conf_1.yaml"])

Retrieving Top Performers & Visualizing Results 📉

# Get the top k best performing configurations
strategy.get_best(top_k=4)
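
Since the strategies minimize, "top-k" means the k configurations with the lowest objective values. A standalone sketch of that selection (illustrative, not the library's code):

```python
import heapq

configs = [{"lrate": 0.1}, {"lrate": 0.2}, {"lrate": 0.3},
           {"lrate": 0.4}, {"lrate": 0.5}]
values = [0.9, 0.3, 0.7, 0.1, 0.5]

# k smallest (value, config) pairs, sorted by ascending objective
top_k = heapq.nsmallest(4, zip(values, configs), key=lambda p: p[0])
```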

Refining the Search Space of Your Strategy 🪓

# Refine the search space after 5 iterations based on top 2 configurations
strategy = RandomSearch(real={"lrate": {"begin": 0.1,
                                        "end": 0.5,
                                        "prior": "uniform"}},
                        integer={"batch_size": {"begin": 1,
                                                "end": 5,
                                                "prior": "log-uniform"}},
                        categorical={"arch": ["mlp", "cnn"]},
                        search_config={"refine_after": 5,
                                       "refine_top_k": 2})
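
The idea behind refinement is to shrink each variable's range to the span covered by the best configurations found so far. A minimal sketch of that computation (illustrative, not the library's internals):

```python
def refine_range(top_configs, key):
    """Shrink a variable's range to the span covered by the top configurations."""
    vals = [c[key] for c in top_configs]
    return {"begin": min(vals), "end": max(vals)}

# Suppose these were the top-2 configs after 5 iterations
top_2 = [{"lrate": 0.22, "batch_size": 2},
         {"lrate": 0.31, "batch_size": 4}]

new_lrate = refine_range(top_2, "lrate")
new_batch = refine_range(top_2, "batch_size")
```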

Development & Milestones for Next Release

You can run the test suite via python -m pytest -vv tests/. If you find a bug or are missing your favourite feature, feel free to contact me @RobertTLange or create an issue 🤗. Here are some features I want to implement for the next release:

  • Add min vs max objective option to choose at strategy init
  • Add text to notebook + visualization for what is implemented
  • Allow space refinement for other strategies
