
Hyperparameter config file generator.

Project description


Hyperparameter config file generator and experiment runner.

Quickstart

$ pip install -U tuneconfig

Usage

Config file generator

import pprint

import tuneconfig

# Map parameter names to short labels used in trial IDs;
# parameters mapped to None are omitted from the trial name.
def format_fn(param):
    fmt = {
        "batch_size": "batch",
        "horizon": "hr",
        "learning_rate": "lr",
        "optimizer": "opt",
        "epochs": None,
        "num_samples": None,
    }
    return fmt.get(param, param)

# Define a configuration template for grid search
config_iterator = tuneconfig.ConfigFactory({
    "batch_size": tuneconfig.grid_search([32, 128]),
    "horizon": 40,
    "learning_rate": tuneconfig.grid_search([0.01, 0.1]),
    "epochs": 1000,
    "optimizer": tuneconfig.grid_search(["Adam", "RMSProp"]),
    "num_samples": 10
    },
    format_fn=format_fn
)

# Iterate over config dicts
for idx, config in enumerate(config_iterator):
    name = config_iterator._trial_id(config)
    print(f"config {idx} ({name}):")
    pprint.pprint(config)
    print()

# Dump config dicts as JSON files
tmp = "/tmp/tuneconfig"
json_config_files = config_iterator.dump(tmp)
print(">> Saved config files:")
pprint.pprint(json_config_files)
Output:

config 0 (batch=32/hr=40/lr=0.01/opt=Adam):
{'batch_size': 32,
 'epochs': 1000,
 'horizon': 40,
 'learning_rate': 0.01,
 'num_samples': 10,
 'optimizer': 'Adam'}

config 1 (batch=32/hr=40/lr=0.01/opt=RMSProp):
{'batch_size': 32,
 'epochs': 1000,
 'horizon': 40,
 'learning_rate': 0.01,
 'num_samples': 10,
 'optimizer': 'RMSProp'}

config 2 (batch=32/hr=40/lr=0.1/opt=Adam):
{'batch_size': 32,
 'epochs': 1000,
 'horizon': 40,
 'learning_rate': 0.1,
 'num_samples': 10,
 'optimizer': 'Adam'}

config 3 (batch=32/hr=40/lr=0.1/opt=RMSProp):
{'batch_size': 32,
 'epochs': 1000,
 'horizon': 40,
 'learning_rate': 0.1,
 'num_samples': 10,
 'optimizer': 'RMSProp'}

config 4 (batch=128/hr=40/lr=0.01/opt=Adam):
{'batch_size': 128,
 'epochs': 1000,
 'horizon': 40,
 'learning_rate': 0.01,
 'num_samples': 10,
 'optimizer': 'Adam'}

config 5 (batch=128/hr=40/lr=0.01/opt=RMSProp):
{'batch_size': 128,
 'epochs': 1000,
 'horizon': 40,
 'learning_rate': 0.01,
 'num_samples': 10,
 'optimizer': 'RMSProp'}

config 6 (batch=128/hr=40/lr=0.1/opt=Adam):
{'batch_size': 128,
 'epochs': 1000,
 'horizon': 40,
 'learning_rate': 0.1,
 'num_samples': 10,
 'optimizer': 'Adam'}

config 7 (batch=128/hr=40/lr=0.1/opt=RMSProp):
{'batch_size': 128,
 'epochs': 1000,
 'horizon': 40,
 'learning_rate': 0.1,
 'num_samples': 10,
 'optimizer': 'RMSProp'}

>> Saved config files:
['/tmp/tuneconfig/batch=32/hr=40/lr=0.01/opt=Adam/config.json',
 '/tmp/tuneconfig/batch=32/hr=40/lr=0.01/opt=RMSProp/config.json',
 '/tmp/tuneconfig/batch=32/hr=40/lr=0.1/opt=Adam/config.json',
 '/tmp/tuneconfig/batch=32/hr=40/lr=0.1/opt=RMSProp/config.json',
 '/tmp/tuneconfig/batch=128/hr=40/lr=0.01/opt=Adam/config.json',
 '/tmp/tuneconfig/batch=128/hr=40/lr=0.01/opt=RMSProp/config.json',
 '/tmp/tuneconfig/batch=128/hr=40/lr=0.1/opt=Adam/config.json',
 '/tmp/tuneconfig/batch=128/hr=40/lr=0.1/opt=RMSProp/config.json']
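Since each trial directory holds a plain `config.json`, a downstream training script can read a config back with the standard library alone. A minimal round-trip sketch (not part of the tuneconfig API; the dict and directory name mirror the first trial above, and `tempfile` stands in for `/tmp/tuneconfig`):

```python
import json
import os
import tempfile

# A config dict as produced by the iterator above (config 0).
config = {
    "batch_size": 32,
    "epochs": 1000,
    "horizon": 40,
    "learning_rate": 0.01,
    "num_samples": 10,
    "optimizer": "Adam",
}

# Dump and reload it the way a training script would consume config.json.
trial_dir = os.path.join(tempfile.mkdtemp(), "batch=32/hr=40/lr=0.01/opt=Adam")
os.makedirs(trial_dir)
path = os.path.join(trial_dir, "config.json")
with open(path, "w") as f:
    json.dump(config, f, indent=2)

with open(path) as f:
    loaded = json.load(f)

assert loaded == config
```

JSON round-trips all the value types used here (ints, floats, strings), so the reloaded dict compares equal to the original.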

License

Copyright (c) 2020 Thiago Pereira Bueno. All Rights Reserved.

tuneconfig is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

tuneconfig is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with tuneconfig. If not, see http://www.gnu.org/licenses/.

Download files

Download the file for your platform.

Source Distribution

tuneconfig-0.14.1.tar.gz (15.1 kB)


Built Distribution

tuneconfig-0.14.1-py3-none-any.whl (33.4 kB)


File details

Details for the file tuneconfig-0.14.1.tar.gz.

File metadata

  • Download URL: tuneconfig-0.14.1.tar.gz
  • Upload date:
  • Size: 15.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/46.1.3 requests-toolbelt/0.9.1 tqdm/4.41.1 CPython/3.6.9

File hashes

Hashes for tuneconfig-0.14.1.tar.gz
  • SHA256: 4dfa98ea021718d87e49d26fc103785cb0e1745f855203a872ff7cd938562405
  • MD5: 947bad2c781c2f8815ecb5371439909f
  • BLAKE2b-256: d5c0eee2728ff105cccb56e04a50d0b4e0c42527c351964506d0ffe55946d8af


File details

Details for the file tuneconfig-0.14.1-py3-none-any.whl.

File metadata

  • Download URL: tuneconfig-0.14.1-py3-none-any.whl
  • Upload date:
  • Size: 33.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/46.1.3 requests-toolbelt/0.9.1 tqdm/4.41.1 CPython/3.6.9

File hashes

Hashes for tuneconfig-0.14.1-py3-none-any.whl
  • SHA256: e7977401be53038b3952331e69fa47da6897b035daeed095853474d35455e04c
  • MD5: bfd51b8d9bd13c3d3365b5f72ab927a3
  • BLAKE2b-256: c819296c2b6c83c8c48bf115d18446894703745e10b5cff163975c7664c93136

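The published digests above can be checked locally with the standard library's `hashlib`. A sketch (the filename passed at the bottom is an assumption about where you saved the download, so the final check is left commented out):

```python
import hashlib

def sha256_of(path, chunk_size=8192):
    """Stream a file through SHA-256 and return its hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Published SHA256 for tuneconfig-0.14.1-py3-none-any.whl (from the table above).
expected = "e7977401be53038b3952331e69fa47da6897b035daeed095853474d35455e04c"
# assert sha256_of("tuneconfig-0.14.1-py3-none-any.whl") == expected
```

Reading in chunks keeps memory flat regardless of file size, which matters more for large wheels than for this 33.4 kB one.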
