Toolbox for ensemble learning on class-imbalanced dataset.

Reason this release was yanked:

When using the imbalanced_ensemble.pipeline module (or the classifier that used this module, such as Bagging-based ensembles), you may receive unexpected print messages.

Project description

Imbalanced Ensemble (IMBENS): ensemble learning for class-imbalanced data in Python.

imbalanced-ensemble (imported as imbalanced_ensemble) is a Python toolbox for quickly implementing and deploying ensemble imbalanced learning algorithms. This package aims to provide users with easy-to-use ensemble imbalanced learning (EIL) methods and related utilities, so that everyone can quickly deploy EIL algorithms to their tasks. The EIL methods implemented in this package share unified APIs and are compatible with other popular Python machine-learning packages such as scikit-learn and imbalanced-learn.

Installation instructions, the API reference, and examples can be found in the documentation.

Installation

imbalanced-ensemble requires the following dependencies:

You can install imbalanced-ensemble from PyPI by running:

$ pip install imbalanced-ensemble

Or you can install imbalanced-ensemble by cloning this repository:

$ git clone https://github.com/ZhiningLiu1998/imbalanced-ensemble.git
$ cd imbalanced-ensemble
$ python setup.py install

Highlights

  • 🍎 Unified, easy-to-use API design.
    All ensemble learning methods implemented in IMBENS share a unified API design. Similar to sklearn, all methods have functions (e.g., fit(), predict(), predict_proba()) that allow users to deploy them with only a few lines of code.
  • 🍎 Extended functionalities, wider application scenarios.
    All methods in IMBENS are ready for multi-class imbalanced classification: we extend binary ensemble imbalanced learning methods so that they also work in the multi-class scenario. Additionally, for supported methods, we provide more training options such as class-wise resampling control and a balancing scheduler during the ensemble training process.
  • 🍎 Detailed training log, quick intuitive visualization.
    We provide additional parameters (e.g., eval_datasets, eval_metrics, training_verbose) in fit() so that users can control the information they want to monitor during ensemble training. We also implement an EnsembleVisualizer to quickly visualize ensemble estimator(s), providing further information and enabling comparison. See the documentation for an example.
  • 🍎 Wide compatibility.
    Imbalanced-ensemble (IMBENS) is designed to be compatible with scikit-learn (sklearn) and other compatible projects such as imbalanced-learn. Users can therefore take advantage of various utilities from the sklearn ecosystem for data processing, cross-validation, hyper-parameter tuning, etc.
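The monitoring hooks mentioned above (eval_datasets, eval_metrics, training_verbose) follow a common pattern: grow the ensemble in stages and score each stage on held-out data. A minimal sketch of that pattern in plain scikit-learn, using a warm_start bagging loop as an illustrative stand-in (not the IMBENS implementation):

```python
# Sketch of per-stage training monitoring with plain scikit-learn.
# The warm_start loop grows the ensemble and scores each stage on a
# validation set, mimicking what eval_datasets/eval_metrics report.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.metrics import balanced_accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_classes=3, n_informative=4,
                           weights=[0.2, 0.3, 0.5], random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, stratify=y, random_state=0)

clf = BaggingClassifier(n_estimators=10, warm_start=True, random_state=0)
history = []
for n in range(10, 51, 10):          # grow the ensemble in steps of 10 trees
    clf.n_estimators = n
    clf.fit(X_tr, y_tr)
    score = balanced_accuracy_score(y_val, clf.predict(X_val))
    history.append((n, score))
    print(f"n_estimators={n:2d}  balanced_acc={score:.3f}")
```

With IMBENS classifiers, the equivalent information is reported by fit() itself when the corresponding parameters are set.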

Background

Class imbalance (also known as the long-tail problem in the multi-class case) refers to a classification problem in which the classes are not represented equally, which is quite common in practice: fraud detection, prediction of rare adverse drug reactions, and prediction of gene families are typical examples. Failing to account for class imbalance often degrades the predictive performance of many classification algorithms.

Imbalanced learning (IL) aims to tackle the class imbalance problem and learn an unbiased model from imbalanced data. This is usually achieved by changing the training data distribution through resampling or reweighting. However, naive resampling or reweighting may introduce bias/variance to the training data, especially when the data has class overlap or contains noise.
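As an illustration of "changing the training distribution by resampling", here is a minimal random-undersampling sketch in NumPy (for real use, tuned and validated implementations are available in imbalanced-learn and IMBENS):

```python
# Minimal random undersampling: drop majority-class samples until every
# class matches the minority-class size. Illustrative only.
import numpy as np

def random_undersample(X, y, random_state=0):
    rng = np.random.default_rng(random_state)
    classes, counts = np.unique(y, return_counts=True)
    n_min = counts.min()
    # For each class, keep a random subset of size n_min (without replacement).
    keep = np.concatenate([
        rng.choice(np.flatnonzero(y == c), size=n_min, replace=False)
        for c in classes
    ])
    return X[keep], y[keep]

X = np.arange(20).reshape(10, 2)
y = np.array([0] * 7 + [1] * 3)       # 7:3 imbalance
X_bal, y_bal = random_undersample(X, y)
print(np.bincount(y_bal))             # -> [3 3]
```

The downside this paragraph points out is visible here: the discarded majority samples may carry useful information, which is exactly the variance that ensemble methods try to recover.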

Ensemble imbalanced learning (EIL) is known to effectively improve typical IL solutions by combining the outputs of multiple classifiers, thereby reducing the variance introduced by resampling/reweighting.
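The core EIL idea can be sketched in a few lines: train each ensemble member on an independent balanced undersample, then average the members' predicted probabilities (an EasyEnsemble-style sketch, not the IMBENS implementation):

```python
# EasyEnsemble-style sketch: each member sees a different balanced
# undersample; soft voting averages their probability estimates.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)

rng = np.random.default_rng(0)
classes, counts = np.unique(y, return_counts=True)
n_min = counts.min()

members = []
for _ in range(10):                   # 10 ensemble members
    # Draw a fresh balanced undersample for this member.
    idx = np.concatenate([
        rng.choice(np.flatnonzero(y == c), size=n_min, replace=False)
        for c in classes
    ])
    members.append(DecisionTreeClassifier(max_depth=3, random_state=0)
                   .fit(X[idx], y[idx]))

proba = np.mean([m.predict_proba(X) for m in members], axis=0)  # soft voting
y_pred = classes[proba.argmax(axis=1)]
```

Because every member sees a different subset of the majority class, the ensemble uses far more of the data than a single undersampled model would.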

List of implemented methods

Currently, 16 ensemble imbalanced learning methods are implemented:

Note: imbalanced-ensemble is still under development.

Usage

Taking self-paced ensemble [1] as an example, deploying it requires fewer than 10 lines of code:

>>> from imbalanced_ensemble.ensemble import SelfPacedEnsembleClassifier
>>> from sklearn.datasets import make_classification
>>> 
>>> X, y = make_classification(n_samples=1000, n_classes=3,
...                            n_informative=4, weights=[0.2, 0.3, 0.5],
...                            random_state=0)
>>> clf = SelfPacedEnsembleClassifier(random_state=0)
>>> clf.fit(X, y)  
SelfPacedEnsembleClassifier(...)
>>> clf.predict(X)  
array([...])
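Since IMBENS estimators follow the scikit-learn API, standard sklearn utilities such as cross_val_score apply to them directly. A sketch of this (using a plain sklearn DecisionTreeClassifier as a stand-in so the snippet runs even without IMBENS installed):

```python
# Any estimator with fit()/predict() works with sklearn model selection
# tools; swap in an IMBENS classifier (e.g., SelfPacedEnsembleClassifier)
# for the stand-in below.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_classes=3, n_informative=4,
                           weights=[0.2, 0.3, 0.5], random_state=0)
scores = cross_val_score(DecisionTreeClassifier(random_state=0), X, y,
                         cv=5, scoring="balanced_accuracy")
print(scores.round(3), scores.mean().round(3))
```

Balanced accuracy is used here because plain accuracy can look deceptively high on imbalanced data.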

For more examples, please refer to the documentation.

Acknowledgements

Many samplers and utilities are adapted from imbalanced-learn, which is an amazing project!

References

[1] Zhining Liu, Wei Cao, Zhifeng Gao, Jiang Bian, Hechang Chen, Yi Chang, and Tie-Yan Liu. Self-paced ensemble for highly imbalanced massive data classification. In 2020 IEEE 36th International Conference on Data Engineering (ICDE). IEEE, 2020, pp. 841-852.
[2] X.-Y. Liu, J. Wu, and Z.-H. Zhou, Exploratory undersampling for class-imbalance learning. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), vol. 39, no. 2, pp. 539–550, 2009.
[3] Chen, Chao, Andy Liaw, and Leo Breiman. “Using random forest to learn imbalanced data.” University of California, Berkeley 110 (2004): 1-12.
[4] C. Seiffert, T. M. Khoshgoftaar, J. Van Hulse, and A. Napolitano, Rusboost: A hybrid approach to alleviating class imbalance. IEEE Transactions on Systems, Man, and Cybernetics-Part A: Systems and Humans, vol. 40, no. 1, pp. 185–197, 2010.
[5] Maclin, R., & Opitz, D. (1997). An empirical evaluation of bagging and boosting. AAAI/IAAI, 1997, 546-551.
[6] N. V. Chawla, A. Lazarevic, L. O. Hall, and K. W. Bowyer, Smoteboost: Improving prediction of the minority class in boosting. in European conference on principles of data mining and knowledge discovery. Springer, 2003, pp. 107–119
[7] S. Wang and X. Yao, Diversity analysis on imbalanced data sets by using ensemble models. in 2009 IEEE Symposium on Computational Intelligence and Data Mining. IEEE, 2009, pp. 324–331.
[8] Fan, W., Stolfo, S. J., Zhang, J., & Chan, P. K. (1999, June). AdaCost: misclassification cost-sensitive boosting. In Icml (Vol. 99, pp. 97-105).
[9] Shawe-Taylor, G. K. J., & Karakoulas, G. (1999). Optimizing classifiers for imbalanced training sets. Advances in neural information processing systems, 11(11), 253.
[10] Viola, P., & Jones, M. (2001). Fast and robust classification using asymmetric adaboost and a detector cascade. Advances in Neural Information Processing System, 14.
[11] Freund, Y., & Schapire, R. E. (1997). A decision-theoretic generalization of on-line learning and an application to boosting. Journal of computer and system sciences, 55(1), 119-139.
[12] Breiman, L. (1996). Bagging predictors. Machine learning, 24(2), 123-140.
[13] Guillaume Lemaître, Fernando Nogueira, and Christos K. Aridas. Imbalanced-learn: A python toolbox to tackle the curse of imbalanced datasets in machine learning. Journal of Machine Learning Research, 18(17):1–5, 2017.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

imbalanced-ensemble-0.1.0.tar.gz (13.0 MB view details)

Uploaded Source

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

imbalanced_ensemble-0.1.0-py3.8.egg (578.5 kB view details)

Uploaded Egg

imbalanced_ensemble-0.1.0-py2.py3-none-any.whl (256.3 kB view details)

Uploaded Python 2, Python 3

File details

Details for the file imbalanced-ensemble-0.1.0.tar.gz.

File metadata

  • Download URL: imbalanced-ensemble-0.1.0.tar.gz
  • Upload date:
  • Size: 13.0 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.6.1 requests/2.24.0 setuptools/50.3.1.post20201107 requests-toolbelt/0.9.1 tqdm/4.50.2 CPython/3.8.5

File hashes

Hashes for imbalanced-ensemble-0.1.0.tar.gz
Algorithm Hash digest
SHA256 c701a4e07e3b653c4ae13a0e70c83cba7eb44024850a19555d95e722fda91203
MD5 f1eb12d50a0eaf60821bf22e9de813b0
BLAKE2b-256 61d2c6ab9967cec43158273ca7731934e015cff184def6fc072888721cbdc3f3

See more details on using hashes here.

File details

Details for the file imbalanced_ensemble-0.1.0-py3.8.egg.

File metadata

  • Download URL: imbalanced_ensemble-0.1.0-py3.8.egg
  • Upload date:
  • Size: 578.5 kB
  • Tags: Egg
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.6.1 requests/2.24.0 setuptools/50.3.1.post20201107 requests-toolbelt/0.9.1 tqdm/4.50.2 CPython/3.8.5

File hashes

Hashes for imbalanced_ensemble-0.1.0-py3.8.egg
Algorithm Hash digest
SHA256 bfaf61b9a89cfc5d693250352471285811e50fa352114e9d4d5e69e531937afc
MD5 0fbdc73cdcb7ef0f216025b69318b9ca
BLAKE2b-256 725bdc6e6ecd993987e4dde80b5055cf4f2b57f06b3faf427229760fcddd9324

See more details on using hashes here.

File details

Details for the file imbalanced_ensemble-0.1.0-py2.py3-none-any.whl.

File metadata

  • Download URL: imbalanced_ensemble-0.1.0-py2.py3-none-any.whl
  • Upload date:
  • Size: 256.3 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.6.1 requests/2.24.0 setuptools/50.3.1.post20201107 requests-toolbelt/0.9.1 tqdm/4.50.2 CPython/3.8.5

File hashes

Hashes for imbalanced_ensemble-0.1.0-py2.py3-none-any.whl
Algorithm Hash digest
SHA256 c6cb3153077efdc884bcbca8638c5fb337c1d3484a0a12059f106035796b3d5f
MD5 4b9a2d8d93a10795358b61b54d424b2b
BLAKE2b-256 6de0f1abc2bd8cd6ffba51af7d9b80a38932dc67bfd339a39c205724b7ed2181

See more details on using hashes here.
