Implements Wide Boosting functions for popular boosting packages
wideboost
Implements wide boosting using popular boosting frameworks as a backend.
Getting started
pip install wideboost
Sample script
XGBoost back-end
import xgboost as xgb
from wideboost.wrappers import wxgb
dtrain = xgb.DMatrix('../../xgboost/demo/data/agaricus.txt.train')
dtest = xgb.DMatrix('../../xgboost/demo/data/agaricus.txt.test')
# Two extra parameters, 'btype' and 'extra_dims'
param = {'btype': 'I', 'extra_dims': 2, 'max_depth': 2, 'eta': 0.1, 'objective': 'binary:logistic', 'eval_metric': ['error']}
num_round = 50
watchlist = [(dtrain,'train'),(dtest,'test')]
wxgb_results = dict()
bst = wxgb.train(param, dtrain, num_round, watchlist, evals_result=wxgb_results)
Parameter Explanations
'btype' indicates how to initialize the beta matrix. Settings are 'I', 'In', 'R', 'Rn'.
'extra_dims' is an integer indicating how many "wide" dimensions are used. When 'extra_dims' is set to 0 (and 'btype' is set to 'I'), wide boosting is equivalent to standard gradient boosting.
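To make the two parameters concrete, here is a minimal sketch of how a beta matrix of shape (outputs + extra_dims, outputs) could be initialized for each 'btype' setting. This is an illustration only, not the wideboost internals: the function name `init_beta` and the exact meaning of the 'n' suffix (assumed here to be column normalization) are assumptions.

```python
import numpy as np

def init_beta(btype, n_outputs, extra_dims, rng=None):
    """Hypothetical sketch: build a (n_outputs + extra_dims) x n_outputs
    beta matrix for the given 'btype'. Not the actual wideboost code."""
    rng = rng or np.random.default_rng(0)
    rows = n_outputs + extra_dims
    if btype.startswith('I'):
        # 'I' variants: identity block on top, zero rows for the extra dims
        beta = np.zeros((rows, n_outputs))
        beta[:n_outputs, :] = np.eye(n_outputs)
    else:
        # 'R' variants: fully random initialization
        beta = rng.standard_normal((rows, n_outputs))
    if btype.endswith('n'):
        # assumption: the 'n' suffix normalizes each column to unit length
        beta /= np.linalg.norm(beta, axis=0, keepdims=True)
    return beta

# With extra_dims=0 and btype='I', beta is the identity matrix, so a wide
# model output of the form F(X) @ beta reduces to the plain boosted F(X) --
# matching the equivalence to standard gradient boosting noted above.
print(init_beta('I', 2, 0))
```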
Reference
Coming Soon!