A simple tool for automatic parameter tuning.
Project description
LightTuner
A simple hyper-parameter optimization toolkit:
- hpo: automatic hyper-parameter tuning
- scheduler: automatic task resource scheduler
Installation
You can simply install it with pip from the official PyPI site:
pip install lighttuner
Or install from the latest source code as follows:
git clone https://github.com/opendilab/LightTuner.git
cd LightTuner
pip install . --user
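Afterwards, a quick import check confirms the package is available (a minimal sanity check, using the same imports as the quick start below):

# verify the installation by importing the HPO entry points
from lighttuner.hpo import hpo, R, M, uniform, randint
print('lighttuner is ready')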
Quick Start for HPO
Here is a simple example:
import random
import time

from ditk import logging
from lighttuner.hpo import hpo, R, M, uniform, randint


@hpo
def opt_func(v):  # this function is still usable after decorating
    x, y = v['x'], v['y']
    time.sleep(5.0)
    logging.info(f"This time's config: {v!r}")  # log will be captured
    if random.random() < 0.5:  # randomly raise exception
        raise ValueError('Fxxk this shxt')  # retry is supported
    return {
        'result': x * y,
        'sum': x + y,
    }


if __name__ == '__main__':
    logging.try_init_root(logging.DEBUG)
    print(opt_func.bayes()  # bayesian optimization algorithm
          .max_steps(50)  # max steps
          .minimize(R['result'])  # the maximize/minimize target you need to optimize
          .concern(M['time'], 'time_cost')  # extra concerned values (from metrics)
          .concern(R['sum'], 'sum')  # extra concerned values (from return value of function)
          .stop_when(R['result'] <= -800)  # conditional stop is supported
          .spaces(  # search spaces
              {
                  'x': uniform(-10, 110),  # continuous space
                  'y': randint(-10, 20),  # integer based space
                  'z': {
                      # 't': choice(['a', 'b', 'c', 'd', 'e']),  # enumerate space
                      't': uniform(0, 10),  # enumerate space is not supported in bayesian optimization
                  },
              }
          ).run())
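As the first comment in the example notes, the decorated function stays directly callable. A minimal sketch, assuming the decorator preserves the original single-dict call signature (the call will still sleep 5 seconds and may randomly raise, exactly as written):

result = opt_func({'x': 2.0, 'y': 3})  # plain call, no optimizer involved
print(result)  # e.g. {'result': 6.0, 'sum': 5.0}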
This optimization process runs in parallel, with n workers by default (n = number of CPUs). If you need to customize the number of workers, just use the max_workers(n) method.
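For example, to cap the pool at 4 workers, slot max_workers into the builder chain from the quick start (a minimal sketch; everything except max_workers(4) is taken verbatim from the example above):

print(opt_func.bayes()
      .max_workers(4)  # limit the parallel pool to 4 workers
      .max_steps(50)
      .minimize(R['result'])
      .spaces({'x': uniform(-10, 110), 'y': randint(-10, 20)})
      .run())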
Quick Start for Scheduler
You can refer to lighttuner/scheduler/README.md for more details.
Contributing
We appreciate all contributions to improve LightTuner, in both logic and system design. Please refer to CONTRIBUTING.md for more guides.
License
LightTuner is released under the Apache 2.0 license.
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution: lighttuner-0.0.1.tar.gz
Built Distribution: lighttuner-0.0.1-py3-none-any.whl
File details
Details for the file lighttuner-0.0.1.tar.gz.
File metadata
- Download URL: lighttuner-0.0.1.tar.gz
- Upload date:
- Size: 38.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.9.13
File hashes
Algorithm | Hash digest
---|---
SHA256 | 1daf1ff5b7960f4502ad6fcf79d39c4ca612340eb91949a90855e884bdca84a3
MD5 | 248cd896b8f15dbadbbe8e37a7a2a282
BLAKE2b-256 | 09afde0e7efa33b045838ceef6ee9d07f315a4b0cd9aecdf25f7b8f381c78510
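To check a downloaded archive against the SHA256 digest above, Python's standard hashlib is enough (a sketch; assumes the file was saved to the current directory):

import hashlib

EXPECTED = '1daf1ff5b7960f4502ad6fcf79d39c4ca612340eb91949a90855e884bdca84a3'
with open('lighttuner-0.0.1.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print('OK' if digest == EXPECTED else 'hash mismatch')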
File details
Details for the file lighttuner-0.0.1-py3-none-any.whl.
File metadata
- Download URL: lighttuner-0.0.1-py3-none-any.whl
- Upload date:
- Size: 48.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.9.13
File hashes
Algorithm | Hash digest
---|---
SHA256 | a626e8b7c4cbbd20f2ee3086fbbb8dd12b207bd26965920e43b690d64abde0a7
MD5 | 2faac9d9eb204120c159e447ed25bcd8
BLAKE2b-256 | 85e7cd76f00cd396c13e9dc64945573735c5e2eae58c058e52d6e07fb6b6b2ed