Lightweight Decision Tree Framework

Project description

chefboost

Chefboost is a lightweight decision tree framework covering the regular ID3, C4.5, CART and regression tree algorithms with categorical feature support, as well as gradient boosting, random forest and adaboost. You just need to write a few lines of code to build decision trees with Chefboost.

Usage

Basically, after importing Chefboost, you just need to pass the dataset as a pandas data frame along with the tree configuration, as illustrated below. The target label must be the right-most column of the data frame. Besides, Chefboost handles both numeric and nominal features and target values, in contrast to its alternatives.

import Chefboost as chef
import pandas as pd

df = pd.read_csv("dataset/golf.txt")
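#the target label (Decision) must be the right-most column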

config = {'algorithm': 'ID3'}
model = chef.fit(df, config)

Outcomes

Built decision trees are stored as Python if statements in the outputs/rules/rules.py file. A sample set of decision rules is shown below.

def findDecision(Outlook,Temperature,Humidity,Wind):
   if Outlook == 'Rain':
      if Wind == 'Weak':
         return 'Yes'
      elif Wind == 'Strong':
         return 'No'
   elif Outlook == 'Sunny':
      if Humidity == 'High':
         return 'No'
      elif Humidity == 'Normal':
         return 'Yes'
   elif Outlook == 'Overcast':
      return 'Yes'

Testing for custom instances

Decision rules are stored in the outputs/rules/ folder when you build decision trees. You can run the built decision tree on new instances as illustrated below.

test_instance = ['Sunny', 'Hot', 'High', 'Weak']
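#feature values follow the training column order: Outlook, Temperature, Humidity, Wind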
model = chef.fit(df, config)
prediction = chef.predict(model, test_instance)
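#per the sample rules above, this instance resolves to 'No' (Sunny outlook, High humidity)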

You can also consume built decision trees directly. In this way, you can restore already-built decision trees and skip the learning steps, or apply transfer learning. Loaded trees offer a findDecision method for testing new instances.

from commons import functions
moduleName = "outputs/rules/rules" #this will load outputs/rules/rules.py
tree = functions.restoreTree(moduleName)
prediction = tree.findDecision(['Sunny', 'Hot', 'High', 'Weak'])

Dispatcher.py shows how to build the different kinds of decision trees and make predictions.

Model save and restoration

You can save your trained models.

model = chef.fit(df.copy(), config)
chef.save_model(model, "model.pkl")

In this way, you can use the same model later to make predictions and skip the training steps. Restoration requires the .py and .pkl files to be stored under outputs/rules.

model = chef.load_model("model.pkl")
prediction = chef.predict(model, ['Sunny',85,85,'Weak'])
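#numeric (85, 85) and nominal ('Sunny', 'Weak') feature values can be mixed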

Sample configurations

Chefboost supports several decision tree, bagging and boosting algorithms. You just need to pass the corresponding configuration to switch between them.

Regular Decision Trees: ID3, C4.5, CART, Regression Tree

config = {'algorithm': 'C4.5'} #ID3, C4.5, CART or Regression

Gradient Boosting

config = {'enableGBM': True, 'epochs': 7, 'learning_rate': 1}

Random Forest

config = {'enableRandomForest': True, 'num_of_trees': 5}

Adaboost

config = {'enableAdaboost': True, 'num_of_weak_classifier': 4}
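
Whichever algorithm you choose, the workflow is the same: pass the configuration to fit and call predict on the returned model. The following minimal sketch reuses the data set path and test instance from the earlier examples with the random forest configuration above.

import Chefboost as chef
import pandas as pd

df = pd.read_csv("dataset/golf.txt")

#any of the configurations above can be swapped in here
config = {'enableRandomForest': True, 'num_of_trees': 5}

model = chef.fit(df, config)
prediction = chef.predict(model, ['Sunny', 'Hot', 'High', 'Weak'])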

Prerequisites

The pandas and numpy Python libraries are used to load data sets in this repository. Run the following commands to install these packages (plus tqdm) if you are using them for the first time.

pip install pandas==0.22.0
pip install numpy==1.14.0
pip install tqdm==4.30.0

Initial tests were run in the following environment.

C:\>python --version
Python 3.6.4 :: Anaconda, Inc.

Installation

You can run the following command in the command prompt to install Chefboost. If git is not recognized as a command in your environment, install git first.

git clone https://github.com/serengil/chefboost.git

Then, create your notebook in the same directory as Chefboost.py.
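
As a quick sanity check of the setup, you can re-run the usage example from above inside that notebook. This is just a sketch, assuming the repository's bundled dataset/golf.txt is present.

import Chefboost as chef
import pandas as pd

#works when this notebook sits next to Chefboost.py in the cloned repository
df = pd.read_csv("dataset/golf.txt")
model = chef.fit(df, {'algorithm': 'ID3'})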

Documentation

This YouTube playlist walks you through using Chefboost step by step for the different algorithms. You can also find detailed documentation about the core algorithms here.

Besides, you can enroll in the online course Decision Trees for Machine Learning From Scratch and follow its curriculum if you are curious about the theory of decision trees and how this framework was developed.

Support

There are many ways to support a project - starring the GitHub repo is one.

Licence

Chefboost is licensed under the MIT License - see LICENSE for more details.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

chefboost-0.0.1.tar.gz (13.0 kB)

Uploaded Source

Built Distribution

chefboost-0.0.1-py3-none-any.whl (17.9 kB)

Uploaded Python 3

File details

Details for the file chefboost-0.0.1.tar.gz.

File metadata

  • Download URL: chefboost-0.0.1.tar.gz
  • Upload date:
  • Size: 13.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/44.0.0 requests-toolbelt/0.9.1 tqdm/4.41.1 CPython/3.6.4

File hashes

Hashes for chefboost-0.0.1.tar.gz

  • SHA256: 7ad218d308f14fbe79592c2383de76f34c3e103cbd0ad3c03d797d3622cc0e2b
  • MD5: 5b2bff167263b0ca7c16191774548f1e
  • BLAKE2b-256: e1a8cf51eb3fb24f30b9225134aac1a7ac3077112c446ecc24cb8b309c567093

See more details on using hashes here.

File details

Details for the file chefboost-0.0.1-py3-none-any.whl.

File metadata

  • Download URL: chefboost-0.0.1-py3-none-any.whl
  • Upload date:
  • Size: 17.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/44.0.0 requests-toolbelt/0.9.1 tqdm/4.41.1 CPython/3.6.4

File hashes

Hashes for chefboost-0.0.1-py3-none-any.whl

  • SHA256: c9e9004697ae1529674acc895b6a558e3c42191af8e3545f0dd379e3673f6adc
  • MD5: f11c7cf2768890d047a8f8da102b96b6
  • BLAKE2b-256: 0394ff185c8f36702b4d95c2b8e23e204774dca8a61183e0d1816c99a4983c45

See more details on using hashes here.
