Automated XGBoost tuning
Project description
XGBTune is a library for automated XGBoost model tuning. Tuning an XGBoost model is as simple as a single function call.
Get Started
from xgbtune import tune_xgb_model

# returns the tuned parameter set and the best boosting round count
params, round_count = tune_xgb_model(params, x_train, y_train)
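For context, here is a minimal, self-contained sketch of a full run. The dataset, the starting parameters, and the final training call are illustrative assumptions and not part of XGBTune; only the tune_xgb_model call comes from the library.

import xgboost as xgb
from sklearn.datasets import load_breast_cancer
from xgbtune import tune_xgb_model

# Illustrative dataset and starting parameters (assumptions, not XGBTune defaults)
x_train, y_train = load_breast_cancer(return_X_y=True)
params = {'objective': 'binary:logistic', 'eval_metric': 'logloss'}

# Tune the parameters; round_count is the best boosting round found
params, round_count = tune_xgb_model(params, x_train, y_train)

# Train a final model with the tuned parameters and round count
dtrain = xgb.DMatrix(x_train, label=y_train)
model = xgb.train(params, dtrain, num_boost_round=round_count)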
Install
XGBTune is available on PyPi and can be installed with pip:
pip install xgbtune
Tuning steps
Tuning is done in the following steps:
compute best round
tune max_depth and min_child_weight
tune gamma
re-compute best round
tune subsample and colsample_bytree
fine tune subsample and colsample_bytree
tune alpha and lambda
tune seed
These steps can be repeated several times; by default, two passes are done. A sketch of what one such step can look like is shown below.
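To illustrate the kind of work a single step performs, here is a rough sketch of tuning max_depth and min_child_weight with a plain grid search over xgboost.cv. This is not XGBTune's actual implementation; the dataset, parameter grid, metric, and fold count are arbitrary assumptions.

import itertools
import xgboost as xgb
from sklearn.datasets import load_breast_cancer

# Illustrative data and base parameters (assumptions)
x_train, y_train = load_breast_cancer(return_X_y=True)
dtrain = xgb.DMatrix(x_train, label=y_train)
base_params = {'objective': 'binary:logistic', 'eval_metric': 'logloss'}

best_score, best_pair = float('inf'), None
# Grid values are illustrative, not XGBTune's defaults
for max_depth, min_child_weight in itertools.product([3, 5, 7], [1, 3, 5]):
    params = dict(base_params, max_depth=max_depth,
                  min_child_weight=min_child_weight)
    cv = xgb.cv(params, dtrain, num_boost_round=200, nfold=5,
                early_stopping_rounds=10, seed=0)
    score = cv['test-logloss-mean'].min()
    if score < best_score:
        best_score, best_pair = score, (max_depth, min_child_weight)

print('best max_depth/min_child_weight:', best_pair)

XGBTune chains several such stages (best round, depth/child weight, gamma, subsampling, regularization, seed) and repeats the whole sequence for each pass.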
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
xgbtune-1.0.1.tar.gz (4.9 kB)
File details
Details for the file xgbtune-1.0.1.tar.gz.
File metadata
- Download URL: xgbtune-1.0.1.tar.gz
- Upload date:
- Size: 4.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.42.1 CPython/3.8.1
File hashes
Algorithm | Hash digest
---|---
SHA256 | fdbf3e6c28bc0bdb20aff678cc104f5830eca4f4ee130eacaf1c3fb026a78ab9
MD5 | cbba5965274d26c18758b71fa1baadce
BLAKE2b-256 | 706152d2734f7477227a2f0af65c73be87fdd4b7f7800b1a6b69063ea7968f90