Automated XGBoost tuning
Project description
XGBTune is a library for automated XGBoost model tuning. Tuning an XGBoost model is as simple as a single function call.
Get Started
from xgbtune import tune_xgb_model
params, round_count = tune_xgb_model(params, x_train, y_train)
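Below is a minimal end-to-end sketch of that call. The sample data loaded through scikit-learn and the starting parameter dict are illustrative assumptions, not requirements of XGBTune; the only fixed parts are the import and the call returning the tuned parameters and the round count, as shown above.

from sklearn.datasets import load_breast_cancer
from xgbtune import tune_xgb_model

# Illustrative sample data; any feature matrix and label vector usable by XGBoost should do.
x_train, y_train = load_breast_cancer(return_X_y=True)

# Starting parameters to tune; an ordinary XGBoost parameter dict (values here are assumptions).
params = {'objective': 'binary:logistic', 'eval_metric': 'logloss'}

params, round_count = tune_xgb_model(params, x_train, y_train)
print('tuned parameters:', params)
print('best round count:', round_count)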
Install
XGBTune is available on PyPI and can be installed with pip:
pip install xgbtune
Tuning steps
The tuning is done in the following steps:
- compute best round
- tune max_depth and min_child_weight
- tune gamma
- re-compute best round
- tune subsample and colsample_bytree
- fine tune subsample and colsample_bytree
- tune alpha and lambda
- tune seed
These steps can be repeated several times; by default, two passes are done. A hand-rolled sketch of one such tuning step is shown below.
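For intuition, this is roughly what one of these steps could look like if written by hand with plain xgboost: a small grid search over max_depth and min_child_weight scored with xgb.cv. This is not XGBTune's internal implementation; the metric, grid ranges, and fold count are assumptions for illustration.

import xgboost as xgb

def tune_depth_and_child_weight(params, dtrain, num_boost_round):
    # Grid-search max_depth and min_child_weight with cross-validation,
    # keeping the pair that yields the lowest mean test RMSE.
    best_score = float('inf')
    best_pair = (params.get('max_depth', 6), params.get('min_child_weight', 1))
    for max_depth in range(3, 10, 2):
        for min_child_weight in range(1, 6, 2):
            trial = dict(params, max_depth=max_depth, min_child_weight=min_child_weight)
            cv = xgb.cv(trial, dtrain, num_boost_round=num_boost_round,
                        nfold=5, metrics='rmse', early_stopping_rounds=10, seed=0)
            score = cv['test-rmse-mean'].min()
            if score < best_score:
                best_score = score
                best_pair = (max_depth, min_child_weight)
    return dict(params, max_depth=best_pair[0], min_child_weight=best_pair[1])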
Project details
Download files
Download the file for your platform.
Source Distribution
xgbtune-1.1.0.tar.gz (5.0 kB)
File details
Details for the file xgbtune-1.1.0.tar.gz.
File metadata
- Download URL: xgbtune-1.1.0.tar.gz
- Upload date:
- Size: 5.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.42.1 CPython/3.8.1
File hashes
Algorithm | Hash digest
---|---
SHA256 | 52fe40be57b5551c559bad48070c2628384552f275db42ccaf75d2e8ebe3a32d
MD5 | 34212c591d9fea1c26ba08460e695a59
BLAKE2b-256 | f20396a050eaf317a460098ccc3044f47411d501f3230ec980db003b66930892
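To check a downloaded sdist against the SHA256 digest above, something like the following works; the local filename is assumed to match the published name.

import hashlib

EXPECTED_SHA256 = '52fe40be57b5551c559bad48070c2628384552f275db42ccaf75d2e8ebe3a32d'

# Hash the downloaded archive and compare it to the published digest.
with open('xgbtune-1.1.0.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print('OK' if digest == EXPECTED_SHA256 else 'hash mismatch')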