
Optuna: A hyperparameter optimization framework


:link: Website | :page_with_curl: Docs | :gear: Install Guide | :pencil: Tutorial | :bulb: Examples | Twitter | LinkedIn | Medium

Optuna is an automatic hyperparameter optimization software framework, particularly designed for machine learning. It features an imperative, define-by-run style user API. Thanks to the define-by-run API, code written with Optuna enjoys high modularity, and users can dynamically construct hyperparameter search spaces.
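
For example, conditional search spaces can be expressed with ordinary Python control flow. The toy sketch below (not from the official docs) makes the number of layers itself a hyperparameter that controls how many further parameters are suggested:

import optuna


def objective(trial):
    # The search space is built on the fly: the value suggested here
    # determines how many further hyperparameters are suggested below.
    n_layers = trial.suggest_int("n_layers", 1, 3)
    layer_sizes = [
        trial.suggest_int(f"n_units_l{i}", 4, 128, log=True)
        for i in range(n_layers)
    ]
    # Toy stand-in score so the sketch runs without any ML library;
    # a real objective would train and evaluate a model here.
    return sum(layer_sizes) / len(layer_sizes)


study = optuna.create_study()
study.optimize(objective, n_trials=20)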

:loudspeaker: News

Help us create the next version of Optuna!

The Optuna 5.0 Roadmap has been published for review. Please take a look at the planned improvements to Optuna and share your feedback in the GitHub issues. PR contributions are also welcome!

Please take a few minutes to fill in this survey and let us know how you use Optuna now and what improvements you'd like. 🤔

All questions are optional. 🙇‍♂️ https://forms.gle/wVwLCQ9g6st6AXuq9

:fire: Key Features

Optuna has modern functionalities as follows:

  • Lightweight, versatile, and platform agnostic architecture
  • Pythonic search spaces
  • Efficient optimization algorithms
  • Easy parallelization
  • Quick visualization

Basic Concepts

We use the terms study and trial as follows:

  • Study: optimization based on an objective function
  • Trial: a single execution of the objective function

Please refer to the sample code below. The goal of a study is to find the optimal set of hyperparameter values (e.g., regressor and svr_c) through multiple trials (e.g., n_trials=100). Optuna is a framework designed to automate and accelerate optimization studies.

Sample code with scikit-learn


import optuna
import sklearn.datasets
import sklearn.ensemble
import sklearn.metrics
import sklearn.model_selection
import sklearn.svm

# Define an objective function to be minimized.
def objective(trial):

    # Invoke suggest methods of a Trial object to generate hyperparameters.
    regressor_name = trial.suggest_categorical('regressor', ['SVR', 'RandomForest'])
    if regressor_name == 'SVR':
        svr_c = trial.suggest_float('svr_c', 1e-10, 1e10, log=True)
        regressor_obj = sklearn.svm.SVR(C=svr_c)
    else:
        rf_max_depth = trial.suggest_int('rf_max_depth', 2, 32)
        regressor_obj = sklearn.ensemble.RandomForestRegressor(max_depth=rf_max_depth)

    X, y = sklearn.datasets.fetch_california_housing(return_X_y=True)
    X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)

    regressor_obj.fit(X_train, y_train)
    y_pred = regressor_obj.predict(X_val)

    error = sklearn.metrics.mean_squared_error(y_val, y_pred)

    return error  # An objective value linked with the Trial object.

study = optuna.create_study()  # Create a new study.
study.optimize(objective, n_trials=100)  # Invoke optimization of the objective function.
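
After the optimization finishes, the best result can be read back from the study object:

# Best objective value and the hyperparameters that produced it.
print(study.best_value)   # the lowest mean squared error found
print(study.best_params)  # e.g. {'regressor': 'SVR', 'svr_c': ...}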

[!NOTE] More examples can be found in optuna/optuna-examples.

The examples cover diverse problem setups such as multi-objective optimization, constrained optimization, pruning, and distributed optimization.
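
As a taste of pruning, the toy sketch below (not taken from the examples repository) reports intermediate values so that a pruner such as MedianPruner can stop unpromising trials early:

import optuna


def objective(trial):
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    score = 0.0
    for step in range(100):
        score += lr * (1.0 - score)  # toy stand-in for one training epoch
        trial.report(score, step)  # report an intermediate value
        if trial.should_prune():  # let the pruner stop unpromising trials
            raise optuna.TrialPruned()
    return score


study = optuna.create_study(
    direction="maximize", pruner=optuna.pruners.MedianPruner()
)
study.optimize(objective, n_trials=30)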

Installation

Optuna is available on the Python Package Index (PyPI) and on Anaconda Cloud.

# PyPI
$ pip install optuna
# Anaconda Cloud
$ conda install -c conda-forge optuna

[!IMPORTANT] Optuna supports Python 3.8 or newer.

We also provide Optuna Docker images on Docker Hub.

Integrations

Optuna has integration features with various third-party libraries. Integrations can be found in optuna/optuna-integration, and the documentation is available here.
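
As one illustration, a minimal sketch of the LightGBM integration might look as follows, assuming lightgbm and optuna-integration are installed (the data here is synthetic, for illustration only):

import lightgbm as lgb
import numpy as np
import optuna
from optuna.integration import LightGBMPruningCallback


def objective(trial):
    # Synthetic binary classification data, for illustration only.
    rng = np.random.default_rng(0)
    X = rng.random((400, 10))
    y = (X[:, 0] > 0.5).astype(int)
    dtrain = lgb.Dataset(X[:300], label=y[:300])
    dval = lgb.Dataset(X[300:], label=y[300:], reference=dtrain)

    params = {
        "objective": "binary",
        "metric": "binary_logloss",
        "verbosity": -1,
        "learning_rate": trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
    }
    evals = {}
    # The callback reports each boosting round to Optuna so that
    # unpromising trials can be pruned automatically.
    lgb.train(
        params,
        dtrain,
        valid_sets=[dval],
        callbacks=[
            lgb.record_evaluation(evals),
            LightGBMPruningCallback(trial, "binary_logloss"),
        ],
    )
    return evals["valid_0"]["binary_logloss"][-1]


study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20)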


Web Dashboard

Optuna Dashboard is a real-time web dashboard for Optuna. You can check the optimization history, hyperparameter importance, etc. in graphs and tables. You don't need to create a Python script to call Optuna's visualization functions. Feature requests and bug reports are welcome!


optuna-dashboard can be installed via pip:

$ pip install optuna-dashboard

[!TIP] Please check out the convenience of Optuna Dashboard using the sample code below.

Sample code to launch Optuna Dashboard

Save the following code as optimize_toy.py.

import optuna


def objective(trial):
    x1 = trial.suggest_float("x1", -100, 100)
    x2 = trial.suggest_float("x2", -100, 100)
    return x1 ** 2 + 0.01 * x2 ** 2


study = optuna.create_study(storage="sqlite:///db.sqlite3")  # Create a new study with database.
study.optimize(objective, n_trials=100)

Then try the commands below:

# Run the study specified above
$ python optimize_toy.py

# Launch the dashboard based on the storage `sqlite:///db.sqlite3`
$ optuna-dashboard sqlite:///db.sqlite3
...
Listening on http://localhost:8080/
Hit Ctrl-C to quit.
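
For comparison, the same optimization history can also be plotted programmatically (assuming plotly is installed):

import optuna

# Load the study that optimize_toy.py wrote to the SQLite storage.
# study_name=None works here because the storage holds a single study.
study = optuna.load_study(study_name=None, storage="sqlite:///db.sqlite3")
optuna.visualization.plot_optimization_history(study).show()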

OptunaHub

OptunaHub is a feature-sharing platform for Optuna. You can use the registered features and publish your packages.

Use registered features

optunahub can be installed via pip:

$ pip install optunahub
# Install AutoSampler dependencies (CPU only is sufficient for PyTorch)
$ pip install cmaes scipy torch --extra-index-url https://download.pytorch.org/whl/cpu

You can load a registered module with optunahub.load_module.

import optuna
import optunahub


def objective(trial: optuna.Trial) -> float:
    x = trial.suggest_float("x", -5, 5)
    y = trial.suggest_float("y", -5, 5)
    return x**2 + y**2


module = optunahub.load_module(package="samplers/auto_sampler")
study = optuna.create_study(sampler=module.AutoSampler())
study.optimize(objective, n_trials=10)

print(study.best_trial.value, study.best_trial.params)

For more details, please refer to the optunahub documentation.

Publish your packages

You can publish your package via optunahub-registry. See the Tutorials for Contributors in OptunaHub.

Communication

  • GitHub Discussions for questions.
  • GitHub Issues for bug reports and feature requests.

Contribution

Any contributions to Optuna are more than welcome!

If you are new to Optuna, please check the good first issues. They are relatively simple and well defined, and are good starting points for getting familiar with the contribution workflow and the other developers.

If you have already contributed to Optuna, we recommend the other contribution-welcome issues.

For general guidelines on how to contribute to the project, take a look at CONTRIBUTING.md.

Reference

If you use Optuna in one of your research projects, please cite our KDD paper "Optuna: A Next-generation Hyperparameter Optimization Framework":

BibTeX
@inproceedings{akiba2019optuna,
  title={{O}ptuna: A Next-Generation Hyperparameter Optimization Framework},
  author={Akiba, Takuya and Sano, Shotaro and Yanase, Toshihiko and Ohta, Takeru and Koyama, Masanori},
  booktitle={The 25th ACM SIGKDD International Conference on Knowledge Discovery \& Data Mining},
  pages={2623--2631},
  year={2019}
}

License

MIT License (see LICENSE).

Optuna uses code from the SciPy and fdlibm projects (see LICENSE_THIRD_PARTY).
