
Lightweight Decision Tree Framework Supporting GBM, Random Forest and Adaboost


ChefBoost


ChefBoost is a lightweight decision tree framework for Python with categorical feature support. It covers the regular decision tree algorithms ID3, C4.5, CART, CHAID and regression trees, as well as some advanced techniques: gradient boosting, random forest and adaboost. You just need to write a few lines of code to build decision trees with ChefBoost.

Installation

The easiest way to install the ChefBoost framework is to download it from PyPI. This installs the library itself along with its prerequisites.

pip install chefboost

Then, you will be able to import the library and use its functionality:

from chefboost import Chefboost as chef

Usage

Basically, you just need to pass the dataset as a pandas data frame and the optional tree configuration as illustrated below.

import pandas as pd

df = pd.read_csv("dataset/golf.txt")
config = {'algorithm': 'C4.5'}
model = chef.fit(df, config = config, target_label = 'Decision')
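
If you do not have the dataset file at hand, an equivalent data frame can be constructed in memory. Below is a minimal sketch; the column names follow the golf dataset used throughout this page, and the rows are illustrative.

import pandas as pd

# illustrative subset of the golf dataset; columns match dataset/golf.txt
df = pd.DataFrame({
    "Outlook": ["Sunny", "Rain", "Overcast", "Rain"],
    "Temperature": ["Hot", "Mild", "Hot", "Cool"],
    "Humidity": ["High", "High", "Normal", "Normal"],
    "Wind": ["Weak", "Strong", "Weak", "Weak"],
    "Decision": ["No", "No", "Yes", "Yes"],
})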

Pre-processing

ChefBoost handles both numeric and nominal features and target values, in contrast to its alternatives. So, you don't have to apply any pre-processing to build trees.

Outcomes

Built decision trees are stored as Python if statements in the tests/outputs/rules directory. A sample of decision rules is demonstrated below.

def findDecision(Outlook, Temperature, Humidity, Wind):
   if Outlook == 'Rain':
      if Wind == 'Weak':
         return 'Yes'
      elif Wind == 'Strong':
         return 'No'
      else:
         return 'No'
   elif Outlook == 'Sunny':
      if Humidity == 'High':
         return 'No'
      elif Humidity == 'Normal':
         return 'Yes'
      else:
         return 'Yes'
   elif Outlook == 'Overcast':
      return 'Yes'
   else:
      return 'Yes'

Testing for custom instances

Decision rules will be stored in the outputs/rules/ folder when you build decision trees. You can run the built decision tree on new instances as illustrated below.

prediction = chef.predict(model, param = ['Sunny', 'Hot', 'High', 'Weak'])
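
To score many instances at once, you can loop over a data frame and call predict per row. This is a minimal sketch assuming the feature columns of the golf dataset:

# batch prediction over a data frame (feature order must match training)
for _, row in df.iterrows():
    features = [row["Outlook"], row["Temperature"], row["Humidity"], row["Wind"]]
    print(chef.predict(model, param = features))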

You can consume built decision trees directly as well. In this way, you can restore already built decision trees and skip the learning steps, or apply transfer learning. Loaded trees offer a findDecision method to test new instances.

moduleName = "outputs/rules/rules" #this will load outputs/rules/rules.py
tree = chef.restoreTree(moduleName)
prediction = tree.findDecision(['Sunny', 'Hot', 'High', 'Weak'])

tests/global-unit-test.py will guide you in building different decision trees and making predictions.

Model save and restoration

You can save your trained models. This makes your model ready for transfer learning.

chef.save_model(model, "model.pkl")

In this way, you can use the same model later to just make predictions, skipping the training steps. Restoration requires the .py and .pkl files to be stored under outputs/rules.

model = chef.load_model("model.pkl")
prediction = chef.predict(model, ['Sunny',85,85,'Weak'])

Sample configurations

ChefBoost supports several decision tree, bagging and boosting algorithms. You just need to pass the configuration to use different algorithms.

Regular Decision Trees

Regular decision tree algorithms find the best feature and the best split point according to the algorithm's metric, then build the tree recursively in the child nodes.

config = {'algorithm': 'C4.5'} #Set algorithm to ID3, C4.5, CART, CHAID or Regression
model = chef.fit(df, config)

The following regular decision tree algorithms are wrapped in the library.

Algorithm   Metric
ID3         Entropy, Information Gain
C4.5        Entropy, Gain Ratio
CART        GINI
CHAID       Chi Square
Regression  Standard Deviation
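
Since only the configuration changes between these algorithms, you can train with each of them in a loop and compare the resulting rules. A minimal sketch follows; the Regression algorithm is skipped here because it expects a numeric target.

# train the same nominal-target data set with each algorithm
for algorithm in ["ID3", "C4.5", "CART", "CHAID"]:
    config = {"algorithm": algorithm}
    model = chef.fit(df, config = config, target_label = "Decision")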

Gradient Boosting

Gradient boosting is basically based on building a tree, then building the next one on the previous one's error. In this way, it boosts the results. The final prediction is the sum of each tree's prediction.

config = {'enableGBM': True, 'epochs': 7, 'learning_rate': 1, 'max_depth': 5}
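
The idea itself can be illustrated without the library: each epoch fits the previous round's residuals, and the final prediction is the running sum. A toy numeric sketch, not ChefBoost internals:

# toy gradient boosting for regression: each round fits the residuals
actual = [10.0, 12.0, 8.0]
learning_rate = 0.5
prediction = [0.0, 0.0, 0.0]  # start from a constant baseline

for epoch in range(7):
    residuals = [a - p for a, p in zip(actual, prediction)]
    # a real implementation fits a regression tree to the residuals;
    # here we pretend the tree recovers them exactly
    prediction = [p + learning_rate * r for p, r in zip(prediction, residuals)]

print(prediction)  # approaches the actual values as epochs grow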

Random Forest

Random forest basically splits the data set into several sub data sets and builds a different decision tree for each of them. The final prediction is the average of each tree's prediction result.

config = {'enableRandomForest': True, 'num_of_trees': 5}
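
The bagging idea behind this can be sketched with plain pandas: draw random sub data sets, build one tree per subset, and combine the per-tree predictions. A minimal sketch with hypothetical per-tree outputs:

# draw bootstrap-style sub data sets; one tree would be built per subset
sub_sets = [df.sample(frac = 1.0, replace = True, random_state = i) for i in range(5)]

# for a regression target, the forest output is the mean of the tree outputs
tree_predictions = [7.5, 8.0, 9.1, 8.4, 7.9]  # hypothetical per-tree results
forest_prediction = sum(tree_predictions) / len(tree_predictions)
print(forest_prediction)  # 8.18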

Adaboost

Adaboost applies decision stumps instead of decision trees. A stump is a weak classifier that only aims to score better than 50%. After each round, the weights of misclassified instances are increased and the weights of correctly classified ones are decreased. In this way, a high score is reached with weak classifiers.

config = {'enableAdaboost': True, 'num_of_weak_classifier': 4}
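
The weight update behind this can be shown in a few lines. A toy sketch with labels in {-1, +1}, not ChefBoost internals:

import math

labels  = [ 1, -1,  1,  1]
stump   = [ 1,  1,  1,  1]           # weak stump: 3 of 4 correct (error 0.25)
weights = [0.25, 0.25, 0.25, 0.25]   # uniform initial weights

error = sum(w for w, y, p in zip(weights, labels, stump) if y != p)
alpha = 0.5 * math.log((1 - error) / error)  # this stump's say in the final vote

# misclassified instances get heavier, correctly classified ones lighter
weights = [w * math.exp(-alpha * y * p) for w, y, p in zip(weights, labels, stump)]
total = sum(weights)
weights = [w / total for w in weights]
print(weights)  # the misclassified instance now carries half of the total weight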

Feature Importance

Decision trees are naturally interpretable and explainable algorithms: the decision of a single tree is easy to follow. Still, some extra layers help in understanding the built models, and ensembles such as random forest and GBM are harder to explain. Herein, feature importance is one of the most common ways to see the big picture and understand the built models.

df = chef.feature_importance("outputs/rules/rules.py")
feature      final_importance
Humidity     0.3688
Wind         0.3688
Outlook      0.2624
Temperature  0.0000
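
Since the importance values come back as a pandas data frame, you can sort or plot them directly. A minimal sketch, assuming matplotlib is installed:

import matplotlib.pyplot as plt

# sort ascending so the most important feature lands on top of the bar chart
df = df.sort_values(by = "final_importance", ascending = True)
df.plot.barh(x = "feature", y = "final_importance", legend = False)
plt.show()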

Parallelism

ChefBoost offers parallelism to speed up model building: branches of a decision tree are created in parallel. You should set the enableParallelism argument to True in the configuration; its default value is False. If parallelism is enabled, half of the total number of cores in your environment is allocated unless you set num_cores explicitly.

if __name__ == '__main__':
   config = {'algorithm': 'C4.5', 'enableParallelism': True, 'num_cores': 2}
   model = chef.fit(df, config)

Notice that the training step must be located in an if block that checks whether you are in the main module; otherwise, the worker processes spawned for parallelism would re-execute the training code.

Contributing

Pull requests are welcome. You should run the unit tests locally by running test/global-unit-test.py and share the unit test result logs in the PR.

Support

There are many ways to support a project - starring⭐️ the GitHub repos is just one 🙏

You can also support this work on Patreon

Citation

Please cite ChefBoost in your publications if it helps your research. Here is an example BibTeX entry:

@misc{serengil2021chefboost,
  author       = {Serengil, Sefik Ilkin},
  title        = {ChefBoost: A Lightweight Boosted Decision Tree Framework},
  month        = oct,
  year         = 2021,
  publisher    = {Zenodo},
  doi          = {10.5281/zenodo.5576203},
  howpublished = {https://doi.org/10.5281/zenodo.5576203}
}

Also, if you use chefboost in your GitHub projects, please add chefboost to your requirements.txt.

License

ChefBoost is licensed under the MIT License - see LICENSE for more details.
