Implements Wide Boosting functions for popular boosting packages
wideboost
Implements wide boosting using popular boosting frameworks as a backend.
Getting started
pip install wideboost
Sample script
XGBoost back-end
import xgboost as xgb
from wideboost.wrappers import wxgb

dtrain = xgb.DMatrix('../../xgboost/demo/data/agaricus.txt.train')
dtest = xgb.DMatrix('../../xgboost/demo/data/agaricus.txt.test')

# Two extra parameters, 'btype' and 'extra_dims'
param = {'btype': 'I', 'extra_dims': 2, 'max_depth': 2, 'eta': 0.1,
         'objective': 'binary:logistic', 'eval_metric': ['error']}
num_round = 50
watchlist = [(dtrain, 'train'), (dtest, 'test')]

wxgb_results = dict()
bst = wxgb.train(param, dtrain, num_round, watchlist, evals_result=wxgb_results)
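After training, the evaluation history lands in the dict passed as evals_result. Assuming xgboost's standard layout (watchlist name, then metric name, then per-round values; an assumption here, since wideboost delegates to the xgboost backend), the final test error can be read back like this, with mocked values standing in for a real run:

```python
# Mocked evals_result in xgboost's standard layout:
# {watchlist_name: {metric_name: [per-round values]}}
wxgb_results = {
    'train': {'error': [0.050, 0.030]},
    'test':  {'error': [0.060, 0.040]},
}

# The last entry is the metric on the final boosting round.
final_test_error = wxgb_results['test']['error'][-1]
print(final_test_error)  # → 0.04
```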
Parameter Explanations
'btype' indicates how to initialize the beta matrix. Settings are 'I', 'In', 'R', 'Rn'.

'extra_dims' is an integer indicating how many "wide" dimensions are used. When 'extra_dims' is set to 0 (and 'btype' is set to 'I'), wide boosting is equivalent to standard gradient boosting.
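The equivalence claim above can be sketched with a small matrix example. The shapes and the name beta here are assumptions based on the description (raw boosting outputs with extra_dims additional columns, mapped to the target dimension through a beta matrix); this is an illustration, not the wideboost internals:

```python
import numpy as np

# Hypothetical shapes: n rows, d target dimensions, extra_dims wide columns.
d, extra_dims, n = 1, 2, 4
rng = np.random.default_rng(0)

# Raw "wide" boosting outputs: one column per wide dimension.
F = rng.normal(size=(n, d + extra_dims))
beta = np.ones((d + extra_dims, d))  # stand-in beta matrix
wide_pred = F @ beta                 # mapped back to shape (n, d)

# With extra_dims = 0 and beta initialized to the identity ('btype' = 'I'),
# the mapping is F @ I = F, i.e. plain gradient boosting.
F0 = rng.normal(size=(n, d))
assert np.allclose(F0 @ np.eye(d), F0)
```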
Reference
Coming Soon!
Source Distribution

wideboost-0.1.1.tar.gz (6.8 kB)

Built Distribution

wideboost-0.1.1-py3-none-any.whl

Hashes for wideboost-0.1.1-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 962dc6d411a96dcb5fa143753ef14ec8f5a5a6cea22897319614e421dbf46bdc
MD5 | e696d47a2a70c790566b187f312bd892
BLAKE2b-256 | f8cb9e17c75e20e799495db78b2129af663ddab8667608f1ef99388e3fc69420