
Conformal hyperparameter optimization tool


ConfOpt


ConfOpt is an inferential hyperparameter optimization package designed to speed up model tuning.

The package currently implements Adaptive Conformal Hyperparameter Optimization (ACHO), as detailed in the original paper.
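To give some intuition for the conformal machinery involved, here is a minimal, self-contained sketch of split conformal prediction, the building block such methods use to turn a surrogate model's point predictions into uncertainty intervals. This is illustrative only, not ConfOpt's internal code: residuals on a held-out calibration set yield a quantile that widens predictions into intervals with roughly the target coverage.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data: y = 2x + Gaussian noise
x = rng.uniform(0, 10, 600)
y = 2 * x + rng.normal(0, 1, 600)

# Split into a proper training set and a calibration set
x_fit, y_fit = x[:300], y[:300]
x_cal, y_cal = x[300:], y[300:]

# Fit a simple surrogate (here: a least-squares line)
slope, intercept = np.polyfit(x_fit, y_fit, 1)

def predict(z):
    return slope * z + intercept

# Conformity scores: absolute residuals on the calibration set
scores = np.abs(y_cal - predict(x_cal))

# Conformal quantile targeting ~90% coverage
alpha = 0.1
n = len(scores)
q = np.quantile(scores, min(1.0, np.ceil((n + 1) * (1 - alpha)) / n))

# Intervals for new points: [prediction - q, prediction + q]
x_new = rng.uniform(0, 10, 1000)
y_new = 2 * x_new + rng.normal(0, 1, 1000)
coverage = (np.abs(y_new - predict(x_new)) <= q).mean()
```

On this toy problem, `coverage` lands close to the 90% target, which is the guarantee conformal methods provide under exchangeability.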

Installation

You can install ConfOpt from PyPI using pip:

pip install confopt

Getting Started

As an example, we'll tune a Random Forest model with data from a regression task.

Start by setting up your training and validation data:

from sklearn.datasets import fetch_california_housing

X, y = fetch_california_housing(return_X_y=True)
split_idx = int(len(X) * 0.5)
X_train, y_train = X[:split_idx, :], y[:split_idx]
X_val, y_val = X[split_idx:, :], y[split_idx:]

Then import the Random Forest model to tune and define a search space for its parameters. The search space must be a dictionary mapping each of the model's parameter names to a list of candidate values to search over:

from sklearn.ensemble import RandomForestRegressor

parameter_search_space = {
    "n_estimators": [10, 30, 50, 100, 150, 200, 300, 400],
    "min_samples_split": [0.005, 0.01, 0.1, 0.2, 0.3],
    "min_samples_leaf": [0.005, 0.01, 0.1, 0.2, 0.3],
    "max_features": [None, 0.8, 0.9, 1.0],  # 1.0 = all features (the int 1 would mean a single feature)
}

Now import the ConformalSearcher class and initialize it with:

  • The model to tune.
  • The training and validation X and y data.
  • The parameter search space.
  • A prediction_type flag specifying whether this is a regression or classification problem.

Hyperparameter tuning can be kicked off with the search method and a specification of how long the tuning should run for (in seconds):

from confopt.tuning import ConformalSearcher

searcher = ConformalSearcher(
    model=RandomForestRegressor(),
    X_train=X_train,
    y_train=y_train,
    X_val=X_val,
    y_val=y_val,
    search_space=parameter_search_space,
    prediction_type="regression",
)

searcher.search(
    runtime_budget=120  # How many seconds to run the search for
)

Once done, you can retrieve the best parameters obtained during tuning using:

searcher.get_best_params()

Or obtain a model already initialized with the best parameters:

searcher.get_best_model()
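As an illustration of what you might do next, the sketch below fits and scores a tuned configuration on the validation split. It uses scikit-learn directly on synthetic data so it runs standalone; the `best_params` values are made up, standing in for what `searcher.get_best_params()` might return.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-in for the walkthrough's train/validation split
X, y = make_regression(n_samples=400, n_features=8, noise=10.0, random_state=0)
split_idx = len(X) // 2
X_train, y_train = X[:split_idx], y[:split_idx]
X_val, y_val = X[split_idx:], y[split_idx:]

# Hypothetical winning configuration (e.g. searcher.get_best_params())
best_params = {
    "n_estimators": 100,
    "min_samples_split": 0.01,
    "min_samples_leaf": 0.005,
    "max_features": 0.8,
}

# Fit the tuned model and score it on the validation data
model = RandomForestRegressor(**best_params, random_state=0)
model.fit(X_train, y_train)
r2 = model.score(X_val, y_val)
```

The same final two steps apply to the object returned by `searcher.get_best_model()`, assuming it follows the scikit-learn estimator API.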

More information on specific parameters and overrides not mentioned in this walk-through can be found in the docstrings or in the examples folder of the main repository.

