
hgboost is a Python package for hyperparameter optimization of xgboost, catboost and lightboost for both classification and regression tasks.

Project description

hgboost - Hyperoptimized Gradient Boosting



hgboost is short for Hyperoptimized Gradient Boosting. It is a Python package for hyperparameter optimization of xgboost, catboost and lightboost using cross-validation, with evaluation of the results on an independent validation set. hgboost can be applied to classification and regression tasks.

hgboost is fun because:

1. Hyperoptimization of the parameter space using a Bayesian approach.
2. Determines the best scoring model(s) using k-fold cross-validation.
3. Evaluates the best model on an independent evaluation set.
4. Fits the model on the entire input dataset using the optimal parameters.
5. Works for classification and regression.
6. Creates a super-hyperoptimized model from an ensemble of all individually optimized models.
7. Returns the model, search space, and test/evaluation results.
8. Makes insightful plots.

A minimal sketch of steps 1-3 follows below.
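The sketch, written directly against hyperopt, scikit-learn and xgboost, illustrates the optimize, cross-validate, evaluate loop that hgboost automates. It is illustrative only and not hgboost's internal code; all names in it (search_space, objective, the toy data) are our own.

# Illustrative sketch of a Bayesian search with k-fold CV and an independent
# validation set; not hgboost internals.
import numpy as np
from hyperopt import fmin, tpe, hp, Trials, space_eval
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_score, train_test_split
from xgboost import XGBClassifier

# Toy data; hgboost would receive X, y directly.
X, y = make_classification(n_samples=500, random_state=42)

# Hold out an independent validation set (step 3).
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=42)

# Small Bayesian (TPE) search space over a few xgboost parameters (step 1).
search_space = {
    'max_depth': hp.choice('max_depth', [3, 5, 7]),
    'learning_rate': hp.loguniform('learning_rate', np.log(0.01), np.log(0.3)),
    'n_estimators': hp.choice('n_estimators', [100, 250, 500]),
}

def objective(params):
    # Score each candidate with 5-fold cross-validated AUC (step 2);
    # hyperopt minimizes, so return the negated score.
    auc = cross_val_score(XGBClassifier(**params), X_train, y_train, cv=5, scoring='roc_auc').mean()
    return -auc

best = fmin(fn=objective, space=search_space, algo=tpe.suggest, max_evals=10, trials=Trials())

# Refit the best candidate and evaluate on the held-out set (step 3).
model = XGBClassifier(**space_eval(search_space, best)).fit(X_train, y_train)
print('validation AUC:', roc_auc_score(y_val, model.predict_proba(X_val)[:, 1]))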

⭐️ Star this repo if you like it ⭐️


Blogs

Medium Blog 1: The Best Boosting Model using Bayesian Hyperparameter Tuning but without Overfitting.

Medium Blog 2: Create Explainable Gradient Boosting Classification models using Bayesian Hyperparameter Optimization.


Documentation pages

The documentation pages provide detailed information about how hgboost works, with many examples.


Colab Notebooks

  • Regression example (open in Colab)

  • Classification example (open in Colab)


Schematic overview of hgboost

Installation

Create and activate a new environment (optional):

conda create -n env_hgboost python=3.8
conda activate env_hgboost

Install from PyPI

pip install hgboost
pip install -U hgboost # Force update
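
To verify the installation (assuming the package exposes a __version__ attribute, as its releases do):

python -c "import hgboost; print(hgboost.__version__)"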

Import hgboost package

from hgboost import hgboost

Examples

Classification example (shown for xgboost; catboost and lightboost use the same API):

# Load library
from hgboost import hgboost

# Initialization
hgb = hgboost(max_eval=10, threshold=0.5, cv=5, test_size=0.2, val_size=0.2, top_cv_evals=10, random_state=42)

# Fit xgboost by hyperoptimization and cross-validation.
# X is the feature matrix and y the labels; pos_label marks the positive class
# (here 'survived', as in the Titanic example from the documentation).
results = hgb.xgboost(X, y, pos_label='survived')

# [hgboost] >Start hgboost classification..
# [hgboost] >Collecting xgb_clf parameters.
# [hgboost] >Number of variables in search space is [11], loss function: [auc].
# [hgboost] >method: xgb_clf
# [hgboost] >eval_metric: auc
# [hgboost] >greater_is_better: True
# [hgboost] >pos_label: True
# [hgboost] >Total dataset: (891, 204) 
# [hgboost] >Hyperparameter optimization..
#  100% |----| 500/500 [04:39<05:21,  1.33s/trial, best loss: -0.8800619834710744]
# [hgboost] >Best performing [xgb_clf] model: auc=0.881198
# [hgboost] >5-fold cross validation for the top 10 scoring models, Total nr. tests: 50
# 100%|██████████| 10/10 [00:42<00:00,  4.27s/it]
# [hgboost] >Evaluate best [xgb_clf] model on independent validation dataset (179 samples, 20.00%).
# [hgboost] >[auc] on independent validation dataset: -0.832
# [hgboost] >Retrain [xgb_clf] on the entire dataset with the optimal parameters settings.
# Plot the classification validation results
hgb.plot_validation()
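
A regression task follows the same pattern. The sketch below is based on the regression API described in the hgboost documentation (xgboost_reg with an rmse loss); treat the exact method and parameter names as assumptions to check against the docs:

# Load library
from hgboost import hgboost

# Initialization for a regression task
hgb = hgboost(max_eval=10, cv=5, test_size=0.2, val_size=0.2, random_state=42)

# Fit xgboost regressor by hyperoptimization and cross-validation
# (X: feature matrix, y: continuous target; method and metric names assumed per the docs)
results = hgb.xgboost_reg(X, y, eval_metric='rmse')

# Inspect the results as in the classification example
hgb.plot_validation()

The ensemble feature (point 6 in the list above) combines the individually optimized models into one. A sketch, assuming the ensemble method as described in the documentation:

# Create an ensemble of the optimized classifiers (method names assumed per the docs)
results = hgb.ensemble(X, y, methods=['xgb_clf', 'ctb_clf', 'lgb_clf'])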


References

* http://hyperopt.github.io/hyperopt/
* https://github.com/dmlc/xgboost
* https://github.com/microsoft/LightGBM
* https://github.com/catboost/catboost

Maintainers

Contribute

  • Contributions are welcome.

Licence

See LICENSE for details.

Coffee

  • If you wish to buy me a coffee for this work, it would be much appreciated :)


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

hgboost-1.1.3.tar.gz (28.6 kB, source)

Built Distribution

hgboost-1.1.3-py3-none-any.whl (27.8 kB, Python 3)

File details

Details for the file hgboost-1.1.3.tar.gz.

File metadata

  • Download URL: hgboost-1.1.3.tar.gz
  • Upload date:
  • Size: 28.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.10.4

File hashes

Hashes for hgboost-1.1.3.tar.gz

  Algorithm    Hash digest
  SHA256       0357322fd570bc6c6f358f2cf0f20ec17709bb9eedbda7bf6a1f0634545e4a00
  MD5          b35d20d98fcabd6f68cb41e363c2608e
  BLAKE2b-256  574bf17afebdcad8e116f0c83f14fa53f22342c9480513f7bb020c1ace4e41f2

See more details on using hashes here.

File details

Details for the file hgboost-1.1.3-py3-none-any.whl.

File metadata

  • Download URL: hgboost-1.1.3-py3-none-any.whl
  • Upload date:
  • Size: 27.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.10.4

File hashes

Hashes for hgboost-1.1.3-py3-none-any.whl

  Algorithm    Hash digest
  SHA256       e3fc17daceca2433ecc7cb1750e98b9f9569cb610208aba3467abc48666e4e08
  MD5          17f82d714bc257f64fba17ad5e599ec6
  BLAKE2b-256  5723b1afac1f4fbe23ca35b88e772a8aff372e9eabbd2ad036b4ff0de293aac3

See more details on using hashes here.
