Lightweight Decision Tree Framework Supporting GBM, Random Forest and Adaboost
chefboost
Chefboost is a lightweight gradient boosting, random forest and adaboost enabled decision tree framework, including the regular ID3, C4.5, CART, CHAID and regression tree algorithms, with support for categorical features. It is lightweight: you need just a few lines of code to build decision trees with Chefboost.
Installation
The easiest way to install the Chefboost framework is to download it from PyPI.
pip install chefboost
An installation guide is also captured as a video here.
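After installation, a quick import check confirms the package is available; a minimal sanity-check sketch:
from chefboost import Chefboost as chef
print("chefboost is ready")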
Usage
Basically, you just need to pass the dataset as a pandas data frame and the tree configuration after importing Chefboost, as illustrated below. The target label must be the rightmost column. Besides, Chefboost handles both numeric and nominal features and target values, in contrast to its alternatives.
from chefboost import Chefboost as chef
import pandas as pd

# the rightmost column of the frame ("Decision") is the target label
df = pd.read_csv("dataset/golf.txt")

config = {'algorithm': 'ID3'}
model = chef.fit(df, config)
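If you do not have the golf dataset at hand, an equivalent data frame can be built inline. A minimal sketch, assuming the target column is named Decision as in the bundled datasets; the rows are illustrative:
from chefboost import Chefboost as chef
import pandas as pd

# illustrative rows; the rightmost column holds the target label
df = pd.DataFrame({
    'Outlook': ['Sunny', 'Sunny', 'Overcast', 'Rain'],
    'Temperature': ['Hot', 'Hot', 'Hot', 'Mild'],
    'Humidity': ['High', 'High', 'High', 'High'],
    'Wind': ['Weak', 'Strong', 'Weak', 'Weak'],
    'Decision': ['No', 'No', 'Yes', 'Yes'],
})
model = chef.fit(df, {'algorithm': 'ID3'})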
Outcomes
Built decision trees are stored as Python if statements in the tests/outputs/rules directory. A sample of decision rules is shown below.
def findDecision(Outlook, Temperature, Humidity, Wind, Decision):
    if Outlook == 'Rain':
        if Wind == 'Weak':
            return 'Yes'
        elif Wind == 'Strong':
            return 'No'
        else:
            return 'No'
    elif Outlook == 'Sunny':
        if Humidity == 'High':
            return 'No'
        elif Humidity == 'Normal':
            return 'Yes'
        else:
            return 'Yes'
    elif Outlook == 'Overcast':
        return 'Yes'
    else:
        return 'Yes'
Testing for custom instances
Decision rules will be stored in the outputs/rules/ folder when you build decision trees. You can run the built decision tree for new instances as illustrated below.
# a new, unlabeled instance in the same feature order as the training set
test_instance = ['Sunny', 'Hot', 'High', 'Weak']
model = chef.fit(df, config)
prediction = chef.predict(model, test_instance)
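To predict for many instances at once, one option is to loop over the rows of a data frame. A minimal sketch, assuming the target label is the rightmost column as above:
# predict for every row, dropping the rightmost target column first
predictions = []
for _, row in df.iterrows():
    instance = row.values[:-1].tolist()
    predictions.append(chef.predict(model, instance))
print(predictions)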
You can also consume built decision trees directly. In this way, you can restore already built decision trees and skip the learning steps, or apply transfer learning. Loaded trees offer a findDecision method to test new instances.
moduleName = "outputs/rules/rules"  # this will load outputs/rules/rules.py
tree = chef.restoreTree(moduleName)
prediction = tree.findDecision(['Sunny', 'Hot', 'High', 'Weak'])
tests/global-unit-test.py will guide you through building different decision trees and making predictions.
Model save and restoration
You can save your trained models.
model = chef.fit(df.copy(), config)
chef.save_model(model, "model.pkl")
In this way, you can use the same model later just to make predictions, skipping the training steps. Restoration requires the .py and .pkl files to be stored under outputs/rules.
model = chef.load_model("model.pkl")
prediction = chef.predict(model, ['Sunny',85,85,'Weak'])
Sample configurations
Chefboost supports several decision tree, bagging and boosting algorithms. You just need to pass the relevant configuration to switch between them.
Regular Decision Trees: ID3 (video), C4.5 (video), CART (video), CHAID (video), Regression Tree (video)
config = {'algorithm': 'C4.5'} #ID3, C4.5, CART, CHAID or Regression
Gradient Boosting (video)
config = {'enableGBM': True, 'epochs': 7, 'learning_rate': 1}
Random Forest (video)
config = {'enableRandomForest': True, 'num_of_trees': 5}
Adaboost (video)
config = {'enableAdaboost': True, 'num_of_weak_classifier': 4}
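Whichever configuration you pick, the training and prediction calls stay the same. A minimal end-to-end sketch with the random forest configuration and the golf dataset from the earlier sections:
from chefboost import Chefboost as chef
import pandas as pd

df = pd.read_csv("dataset/golf.txt")
config = {'enableRandomForest': True, 'num_of_trees': 5}
model = chef.fit(df, config)
prediction = chef.predict(model, ['Sunny', 'Hot', 'High', 'Weak'])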
Feature Importance (video)
Decision trees are inherently interpretable and explainable algorithms. Still, we can add some extra layers to explain the built models. Herein, feature importance is one of the most common ways to make models transparent.
if __name__ == '__main__':
    config = {'algorithm': 'C4.5', 'enableParallelism': True}
    model = chef.fit(df, config)
    fi = chef.feature_importance()
    print(fi)
This returns the feature importance values as a pandas data frame.
| feature | final_importance |
| --- | --- |
| Wind | 0.609868 |
| Humidity | 0.265105 |
| Temperature | 0.197528 |
| Outlook | -0.072501 |
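Since the result is a regular pandas data frame, the usual pandas operations apply. A small follow-up sketch that sorts the values and draws a bar chart; matplotlib is assumed to be installed:
import matplotlib.pyplot as plt

# sort by importance and draw a horizontal bar chart
fi = fi.sort_values(by="final_importance", ascending=True)
fi.plot.barh(x="feature", y="final_importance", legend=False)
plt.show()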
Parallelism
Chefboost offers parallelism to speed up model building. In this way, branches of a decision tree will be created in parallel. You should pass the enableParallelism argument as True in the configuration; its default value is False.
if __name__ == '__main__':
    config = {'algorithm': 'C4.5', 'enableParallelism': True}
    model = chef.fit(df, config)
Notice that you have to place the training step in an if block that checks whether you are in the main program.
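To see what parallelism buys you on your own dataset, a hedged timing sketch such as the one below can help; the speed-up depends on the tree shape and the number of CPU cores:
import time

if __name__ == '__main__':
    for parallel in (False, True):
        config = {'algorithm': 'C4.5', 'enableParallelism': parallel}
        start = time.time()
        model = chef.fit(df.copy(), config)
        print(f"enableParallelism={parallel}: {time.time() - start:.2f}s")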
E-Learning
This playlist guides you through using Chefboost step by step for different algorithms.
You can also find tutorials about these core algorithms here.
Besides, you can enroll in the online course Decision Trees for Machine Learning From Scratch and follow its curriculum if you are curious about the theory of decision trees and how this framework was developed.
Support
There are many ways to support a project - starring⭐️ the GitHub repos is just one.
Citation
Please cite chefboost in your publications if it helps your research. Here is an example BibTeX entry:
@misc{serengil2019chefboost,
  abstract = {Lightweight Decision Trees Framework supporting Gradient Boosting (GBDT, GBRT, GBM), Random Forest and Adaboost w/categorical features support for Python},
  author = {Serengil, Sefik Ilkin},
  title = {chefboost},
  url = {https://github.com/serengil/chefboost},
  year = {2019}
}
Licence
Chefboost is licensed under the MIT License - see LICENSE for more details.
The Chefboost logo was created by Tang Ge and is licensed under the Creative Commons Attribution 3.0 License.