Bagging Classifier with Under Sampling
Project description
# USBaggingClassifier

## Overview

A bagging classifier with under-sampling. This approach works well for classifying imbalanced data, and supports both binary and multi-class classification. The methods follow scikit-learn-style APIs. Python 3.x only.

## Usage

### Parameters

* base_estimator : object
  A classifier with a scikit-learn-style interface (e.g. sklearn.XXClassifier). It must implement fit(X, y) and predict(X). predict_proba(X) is not required, but if the estimator provides it you can select the 'soft' voting option and obtain predicted probabilities.
* n_estimators : int (default=10)
  The number of base estimators.
* voting : str {'hard', 'soft'} (default='hard')
  hard : majority-rule voting
  soft : argmax of the summed prediction probabilities
* n_jobs : int (default=1)
  The number of jobs to run in parallel for fit. If -1, the number of CPU cores is used.

### Methods

* fit(X, y)
  X : pandas.DataFrame
  y : pandas.Series
  return : None
* predict(X)
  X : pandas.DataFrame
  return : predicted y : numpy.array
* predict_proba(X)
  X : pandas.DataFrame
  return : predicted probabilities (the mean over all bagged models)
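To illustrate the idea behind the package, here is a minimal sketch (not the package's actual source) of bagging with per-estimator under-sampling: each bag draws an equal number of samples from every class, a base estimator is fit on the balanced bag, and predictions are combined by majority vote (hard) or by averaging probabilities (soft). The class name `UnderSampledBagging` is hypothetical; a scikit-learn `DecisionTreeClassifier` stands in for `base_estimator`.

```python
# Illustrative sketch of under-sampling bagging; class and attribute
# names here are assumptions, not the package's real API.
import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeClassifier


class UnderSampledBagging:
    def __init__(self, base_estimator=None, n_estimators=10, random_state=0):
        self.base_estimator = base_estimator or DecisionTreeClassifier()
        self.n_estimators = n_estimators
        self.random_state = random_state

    def fit(self, X, y):
        rng = np.random.default_rng(self.random_state)
        X, y = np.asarray(X), np.asarray(y)
        self.classes_, counts = np.unique(y, return_counts=True)
        n_min = counts.min()  # size of the smallest class
        self.estimators_ = []
        for _ in range(self.n_estimators):
            # Under-sample: draw n_min points from every class so the bag is balanced.
            idx = np.concatenate([
                rng.choice(np.flatnonzero(y == c), size=n_min, replace=False)
                for c in self.classes_
            ])
            self.estimators_.append(clone(self.base_estimator).fit(X[idx], y[idx]))
        return self

    def predict(self, X):
        # Hard voting: majority class across all bagged estimators.
        preds = np.stack([est.predict(np.asarray(X)) for est in self.estimators_])
        counts = np.stack([(preds == c).sum(axis=0) for c in self.classes_])
        return self.classes_[np.argmax(counts, axis=0)]

    def predict_proba(self, X):
        # Soft voting basis: mean of per-estimator class probabilities.
        return np.mean(
            [est.predict_proba(np.asarray(X)) for est in self.estimators_], axis=0
        )
```

In the real package, hard versus soft voting would be selected through the `voting` parameter; this sketch simply exposes both combination rules as `predict` and `predict_proba`.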
# LICENSE

This software is released under the MIT License, see [LICENSE](/LICENSE).
Download files
Hashes for usbclassifier-0.1.1-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | bbf59f4a7e85399379c46945369ce9eff434dd56c8197762caa33d9330821455 |
| MD5 | 4c5c41b039f2ed3256454ac57b4783b8 |
| BLAKE2b-256 | fc5dd08a916ed65f1a4142c71a8cef4b95b9e8e8860d3129c31815a957f38398 |