sk-stepwise

Overview

StepwiseHyperoptOptimizer is a custom Python class that combines the Hyperopt optimization library with a stepwise strategy for hyperparameter tuning of machine learning models. It subclasses scikit-learn's BaseEstimator and MetaEstimatorMixin, so it integrates cleanly into existing machine learning workflows.

The class optimizes a model's hyperparameters sequentially, following a predefined series of hyperparameter spaces. Each step in the sequence refines a specific subset of parameters, which keeps the search targeted and efficient. Within each step, the optimization uses the Tree-structured Parzen Estimator (TPE) algorithm provided by Hyperopt.
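
Conceptually, each step searches only its own subset of parameters while reusing the best values found in earlier steps. The following sketch illustrates that loop directly with hyperopt and scikit-learn; it is a simplified illustration of the idea, not the class's actual implementation:

import numpy as np
import pandas as pd
from hyperopt import fmin, hp, space_eval, tpe
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

X = pd.DataFrame(np.random.rand(100, 5))
y = pd.Series(np.random.rand(100))

# Each dictionary is one optimization step; later steps keep earlier winners fixed.
param_space_sequence = [
    {"n_estimators": hp.choice("n_estimators", [50, 100, 150])},
    {"max_depth": hp.choice("max_depth", [3, 5, 8, None])},
]

best_params = {}  # accumulates the best value for each parameter, step by step
for step_space in param_space_sequence:
    def objective(step_params):
        model = RandomForestRegressor(**best_params, **step_params)
        score = cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error").mean()
        return -score  # hyperopt minimizes, so negate the neg-MSE score

    raw_best = fmin(objective, step_space, algo=tpe.suggest, max_evals=20)
    # fmin returns indices for hp.choice parameters; space_eval maps them back to values
    best_params.update(space_eval(step_space, raw_best))

print(best_params)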

Features

  • Stepwise Hyperparameter Tuning: Break down the optimization process into multiple steps, each refining a specific set of hyperparameters.
  • Hyperopt Integration: Utilize Hyperopt's TPE algorithm to find the optimal parameters efficiently.
  • Scikit-learn Compatibility: StepwiseHyperoptOptimizer behaves like a standard scikit-learn estimator, so it drops into existing pipelines and workflows.
  • Flexible Scoring: Supports both default scikit-learn scoring metrics and custom scoring functions.

Installation

pip install sk-stepwise

Usage

Here's an example of how to use StepwiseHyperoptOptimizer to optimize a scikit-learn model:

>>> import numpy as np
>>> import pandas as pd
>>> from sklearn.ensemble import RandomForestRegressor
>>> from sk_stepwise import StepwiseHyperoptOptimizer
>>> import hyperopt

>>> # Sample data
>>> X = pd.DataFrame(np.random.rand(100, 5), columns=[f"feature_{i}" for i in range(5)])
>>> y = pd.Series(np.random.rand(100))

>>> # Define the model
>>> model = RandomForestRegressor()

>>> # Define the parameter space sequence for stepwise optimization
>>> param_space_sequence = [
...     {"n_estimators": hyperopt.hp.choice("n_estimators", [50, 100, 150])},
...     {"max_depth": hyperopt.hp.quniform("max_depth", 3, 10, 1)},
...     {"min_samples_split": hyperopt.hp.uniform("min_samples_split", 0.1, 1.0)},
... ]

>>> # Create the optimizer
>>> optimizer = StepwiseHyperoptOptimizer(
...     model=model,
...     param_space_sequence=param_space_sequence,
...     max_evals_per_step=50,
... )

>>> # Fit the optimizer
>>> optimizer.fit(X, y)

>>> # Make predictions
>>> predictions = optimizer.predict(X)
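
Because the optimizer follows the standard scikit-learn estimator interface, it should also work as the final step of a Pipeline. The snippet below is a hedged sketch that builds on the objects defined above; it is not an official recipe from the project:

>>> from sklearn.pipeline import Pipeline
>>> from sklearn.preprocessing import StandardScaler

>>> # Scale features, then hand the data to the stepwise optimizer
>>> pipe = Pipeline([
...     ("scale", StandardScaler()),
...     ("optimize", StepwiseHyperoptOptimizer(
...         model=RandomForestRegressor(),
...         param_space_sequence=param_space_sequence,
...         max_evals_per_step=10,
...     )),
... ])
>>> pipe.fit(X, y)
>>> pipeline_predictions = pipe.predict(X)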

Key Methods

  • fit(X, y): Fits the optimizer to the data, performing stepwise hyperparameter optimization.
  • predict(X): Uses the optimized model to make predictions.
  • score(X, y): Evaluates the optimized model on a test set.
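
For example, continuing the usage snippet above, the fitted optimizer can be evaluated like any other estimator (in practice on held-out data; the training data is reused here only to keep the example short):

>>> # Evaluate with the configured scoring metric (neg_mean_squared_error by default)
>>> validation_score = optimizer.score(X, y)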

Parameters

  • model (_Fitable): A scikit-learn compatible model that implements the fit, predict, and set_params methods.
  • param_space_sequence (list[dict]): A list of dictionaries representing the hyperparameter spaces for each optimization step.
  • max_evals_per_step (int): The maximum number of evaluations to perform for each step of the optimization.
  • cv (int): Number of cross-validation folds.
  • scoring (str or Callable): The scoring metric to use for evaluation. Default is "neg_mean_squared_error".
  • random_state (int): Random seed for reproducibility.
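
As an illustration of these parameters, the optimizer from the usage example could be configured with explicit cross-validation, a custom scorer built with scikit-learn's make_scorer, and a fixed seed. This is a sketch based on the parameter list above:

>>> from sklearn.metrics import make_scorer, mean_absolute_error

>>> # greater_is_better=False tells scikit-learn to negate the error, so higher scores are better
>>> mae_scorer = make_scorer(mean_absolute_error, greater_is_better=False)
>>> custom_optimizer = StepwiseHyperoptOptimizer(
...     model=RandomForestRegressor(),
...     param_space_sequence=param_space_sequence,
...     max_evals_per_step=50,
...     cv=5,
...     scoring=mae_scorer,
...     random_state=42,
... )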

Contributing

Contributions are welcome! Feel free to open issues or pull requests for new features, bug fixes, or documentation improvements.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgements

  • Hyperopt for hyperparameter optimization.
  • scikit-learn for model implementation and evaluation utilities.
