
A simple tool for automatic parameter tuning.

Project description

LightTuner


A simple hyper-parameter optimization toolkit:

  • hpo: automatic hyper-parameter tuning
  • scheduler: automatic task resource scheduler

Installation

You can install it simply with pip from the official PyPI site:

pip install lighttuner

Or install from the latest source code as follows:

git clone https://github.com/opendilab/LightTuner.git
cd LightTuner
pip install . --user
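
Either way, a quick sanity check that the package is importable (assuming python points at the same environment pip installed into):

python -c "import lighttuner.hpo"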

Quick Start for HPO

Here is a simple example:

import random
import time

from ditk import logging

from lighttuner.hpo import hpo, R, M, uniform, randint


@hpo
def opt_func(v):  # this function is still usable after decorating
    x, y = v['x'], v['y']
    time.sleep(5.0)
    logging.info(f"This time's config: {v!r}")  # log will be captured
    if random.random() < 0.5:  # randomly raise an exception
        raise ValueError('Simulated random failure')  # retry is supported

    return {
        'result': x * y,
        'sum': x + y,
    }


if __name__ == '__main__':
    logging.try_init_root(logging.DEBUG)
    print(opt_func.bayes()  # bayesian optimization algorithm
          .max_steps(50)  # max steps
          .minimize(R['result'])  # the target to minimize (maximizing is also supported)
          .concern(M['time'], 'time_cost')  # extra concerned values (from metrics)
          .concern(R['sum'], 'sum')  # extra concerned values (from the function's return value)
          .stop_when(R['result'] <= -800)  # conditional stop is supported
          .spaces({  # search spaces
              'x': uniform(-10, 110),  # continuous space
              'y': randint(-10, 20),  # integer-based space
              'z': {
                  # 't': choice(['a', 'b', 'c', 'd', 'e']),  # enumerated space
                  't': uniform(0, 10),  # enumerated spaces are not supported in bayesian optimization
              },
          }).run())
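
The comment in the search space above notes that bayesian optimization cannot handle enumerated spaces; a plain random search has no such limitation in principle. A minimal sketch, assuming the decorated function also exposes a random() entry point alongside bayes() (not confirmed by this page, so check the API of your installed version):

print(opt_func.random()  # random search; .random() is an assumption, verify against your version
      .max_steps(50)
      .minimize(R['result'])
      .spaces({
          'x': uniform(-10, 110),
          'y': randint(-10, 20),
      }).run())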

This optimization process runs in parallel, with n workers (n = number of CPUs) by default. If you need to customize the number of workers, just use the max_workers(n) method, as shown below.
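
For example, to cap the pool at four workers (a minimal sketch reusing the objective and space from above; the chaining position of max_workers is assumed to match the other builder methods):

print(opt_func.bayes()
      .max_workers(4)  # use 4 parallel workers instead of the CPU count
      .max_steps(50)
      .minimize(R['result'])
      .spaces({
          'x': uniform(-10, 110),
          'y': randint(-10, 20),
      }).run())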

Quick Start for Scheduler

You can refer to lighttuner/scheduler/README.md for more details.

Contributing

We appreciate all contributions to improve LightTuner, in both logic and system design. Please refer to CONTRIBUTING.md for more guidance.

License

LightTuner is released under the Apache 2.0 license.


Download files

Download the file for your platform. If you're not sure which to choose, see the PyPI documentation on installing packages.

Source Distribution

lighttuner-0.0.1.tar.gz (38.4 kB)

Built Distribution

lighttuner-0.0.1-py3-none-any.whl (48.2 kB)

File details

Details for the file lighttuner-0.0.1.tar.gz.

File metadata

  • Download URL: lighttuner-0.0.1.tar.gz
  • Upload date:
  • Size: 38.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.13

File hashes

Hashes for lighttuner-0.0.1.tar.gz

Algorithm    Hash digest
SHA256       1daf1ff5b7960f4502ad6fcf79d39c4ca612340eb91949a90855e884bdca84a3
MD5          248cd896b8f15dbadbbe8e37a7a2a282
BLAKE2b-256  09afde0e7efa33b045838ceef6ee9d07f315a4b0cd9aecdf25f7b8f381c78510

See the PyPI documentation for more details on using hashes.
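
Published hashes like these let you verify a download before installing it. A minimal sketch in Python, assuming the sdist has been downloaded into the current directory:

import hashlib

# expected SHA256 digest published above for the sdist
EXPECTED = '1daf1ff5b7960f4502ad6fcf79d39c4ca612340eb91949a90855e884bdca84a3'

with open('lighttuner-0.0.1.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == EXPECTED, f'hash mismatch: {digest}'
print('sha256 verified')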

File details

Details for the file lighttuner-0.0.1-py3-none-any.whl.

File metadata

  • Download URL: lighttuner-0.0.1-py3-none-any.whl
  • Upload date:
  • Size: 48.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.13

File hashes

Hashes for lighttuner-0.0.1-py3-none-any.whl

Algorithm    Hash digest
SHA256       a626e8b7c4cbbd20f2ee3086fbbb8dd12b207bd26965920e43b690d64abde0a7
MD5          2faac9d9eb204120c159e447ed25bcd8
BLAKE2b-256  85e7cd76f00cd396c13e9dc64945573735c5e2eae58c058e52d6e07fb6b6b2ed

See the PyPI documentation for more details on using hashes.
