
easyopt

easyopt is a super simple yet super powerful Optuna-based hyperparameter optimization framework that requires almost no coding.

Features

  • YAML Configuration
  • Distributed Parallel Optimization
  • Experiment Monitoring and Crash Recovery
  • Experiment Replicas
  • Real-Time Pruning
  • A wide variety of sampling strategies
    • Tree-structured Parzen Estimator
    • CMA-ES
    • Grid Search
    • Random Search
  • A wide variety of pruning strategies
    • Asynchronous Successive Halving Pruning
    • Hyperband Pruning
    • Median Pruning
    • Threshold Pruning
  • A wide variety of DBMSs
    • Redis
    • SQLite
    • PostgreSQL
    • MySQL
    • Oracle
    • And many more

Installation

To install easyopt, just run:

pip install easyopt

Example

easyopt expects hyperparameters to be passed as command-line arguments.

For example, the following problem has two hyperparameters, x and y:

import argparse

parser = argparse.ArgumentParser()

parser.add_argument("--x", type=float, required=True)
parser.add_argument("--y", type=float, required=True)

args = parser.parse_args()

def objective(x, y):
    return x**2 + y**2

F = objective(args.x, args.y)

To integrate easyopt you just have to:

  • import easyopt
  • Call easyopt.objective(...) to report the experiment's objective function value

The above code becomes:

import argparse
import easyopt

parser = argparse.ArgumentParser()

parser.add_argument("--x", type=float, required=True)
parser.add_argument("--y", type=float, required=True)

args = parser.parse_args()

def objective(x, y):
    return x**2 + y**2

F = objective(args.x, args.y)
easyopt.objective(F)

Next, you have to create an easyopt.yml file to define the problem's search space, sampler, pruner, storage, etc.

command: python problem.py {args}
storage: sqlite:////tmp/easyopt-toy-problem.db
sampler: TPESampler
parameters:
  x:
    distribution: uniform
    low: -10
    high: 10
  y:
    distribution: uniform
    low: -10
    high: 10

You can find the complete list of distributions here (all the suggest_* functions).
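
Other distribution types are declared the same way. The snippet below is only an illustrative sketch: the distribution names (loguniform, int, categorical) and their arguments are assumed to mirror Optuna's suggest_* functions, so check the linked list for the exact names supported by your version.

parameters:
  learning_rate:
    distribution: loguniform    # assumed to map to suggest_loguniform(low, high)
    low: 0.00001
    high: 0.1
  layers:
    distribution: int           # assumed to map to suggest_int(low, high)
    low: 1
    high: 5
  optimizer:
    distribution: categorical   # assumed to map to suggest_categorical(choices)
    choices: ["adam", "sgd"]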

Finally, you have to create a study:

easyopt create test-study

And run as many agents as you want:

easyopt agent test-study
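
Agents coordinate through the storage defined in easyopt.yml, so several of them can run in parallel against the same study, on one machine or (with a networked DBMS such as PostgreSQL) on several. For example, two local agents in the background:

easyopt agent test-study &
easyopt agent test-study &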

After a while, the hyperparameter optimization will finish:

Trial 0 finished with value: 90.0401543850028 and parameters: {'x': 5.552902529323713, 'y': 7.694506344453366}. Best is trial 0 with value: 90.0401543850028.
Trial 1 finished with value: 53.38635524683359 and parameters: {'x': 0.26609756303111, 'y': 7.301749607716118}. Best is trial 1 with value: 53.38635524683359.
Trial 2 finished with value: 64.41207387363161 and parameters: {'x': 7.706366704967074, 'y': 2.2414250115064167}. Best is trial 1 with value: 53.38635524683359.
...
...
Trial 53 finished with value: 0.5326245807950265 and parameters: {'x': -0.26584110075742917, 'y': 0.6796713102251005}. Best is trial 35 with value: 0.11134607529340049.
Trial 54 finished with value: 8.570230212116037 and parameters: {'x': 2.8425893061307295, 'y': 0.6999401751487438}. Best is trial 35 with value: 0.11134607529340049.
Trial 55 finished with value: 96.69479467451664 and parameters: {'x': -0.3606041968175481, 'y': -9.826736960342137}. Best is trial 35 with value: 0.11134607529340049.

YAML Structure

The YAML configuration file is structured as follows (a filled-in example appears after the field descriptions):

command: <command>
storage: <storage>
sampler: <sampler>
pruner: <pruner>
direction: <direction>
replicas: <replicas>
parameters:
  parameter-1:
    distribution: <distribution>
    <arg1>: <value1>
    <arg2>: <value2>
    ...
  ...

  • command: the command to execute to run the experiment.
    • {args} will be expanded to --parameter-1=value-1 --parameter-2=value-2
    • {name} will be expanded to the study name
  • storage: the storage to use for the study. A full list of storages is available here
  • sampler: the sampler to use. The full list of samplers is available here
  • pruner: the pruner to use. The full list of pruners is available here
  • direction: can be minimize or maximize (default: minimize)
  • replicas: the number of replicas to run for the same experiment; the reported result is their average (default: 1)
  • parameters: the parameters to optimize
    • for each parameter you have to specify:
      • distribution: the distribution to use. The full list of distributions is available here (all the suggest_* functions)
      • <arg>: <value>: the arguments of the distribution. The arguments documentation is available here
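
Putting the fields together, a complete easyopt.yml for the toy problem above might look like the following. This is an illustrative sketch: the pruner name assumes the corresponding Optuna class name (MedianPruner, like TPESampler), while the command and storage lines are taken from the earlier example.

command: python problem.py {args}
storage: sqlite:////tmp/easyopt-toy-problem.db
sampler: TPESampler
pruner: MedianPruner
direction: minimize
replicas: 3
parameters:
  x:
    distribution: uniform
    low: -10
    high: 10
  y:
    distribution: uniform
    low: -10
    high: 10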

CLI Interface

easyopt offers two CLI commands (examples below):

  • create to create a study using the easyopt.yml file or the one specified with --config
  • agent <study_name> to run the agent for <study_name>
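
For example (the alternative config file name is a placeholder, and the exact placement of the --config option is an assumption, so check the CLI help):

easyopt create --config my-config.yml test-study
easyopt agent test-study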

LIB Interface

When you import easyopt, you can use three functions (a usage sketch follows the list):

  • easyopt.objective(value) to report the final objective function value of the experiment
  • easyopt.report(value) to report the current objective function value of the experiment (used by the pruner)
  • easyopt.should_prune() returns True if the pruner thinks that the run should be pruned
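
Putting the three functions together, here is a minimal sketch of a training loop with pruning support. The loop body is a stand-in (a dummy random loss); only the easyopt.* calls come from the library.

import random

import easyopt

best = float("inf")
for epoch in range(100):
    loss = random.random()        # placeholder for one epoch of real training
    best = min(best, loss)
    easyopt.report(loss)          # report the intermediate value used by the pruner
    if easyopt.should_prune():    # the pruner considers this run unpromising
        break

easyopt.objective(best)           # report the final objective value of the experiment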

Examples

You can find some examples here.

Contributions and license

The code is released as Free Software under the GNU/GPLv3 license. Copying, adapting and republishing it is not only allowed but also encouraged.

For any further questions, feel free to reach me at federico.galatolo@ing.unipi.it or on Telegram @galatolo.
