
A Python package implementing Variable Grouping Based Bayesian Additive Regression Tree (GBART), associated with our ICASSP 2020 submission.


GBART

Introduction

GBART is a pure Python package implementing the GBART algorithm proposed in our ICASSP 2020 submission:

Variable Grouping based Bayesian Additive Regression Tree

GBART seeks a grouping of variables such that there are no nonlinear interaction terms between variables in different groups, and then builds a final model that takes advantage of this concealed structure.

GBART consists of two stages. The first stage is to search for potential interactions and an appropriate grouping of variables. The second stage is to build a final model based on the discovered groups.

This package is built on top of the BartPy package, a pure Python implementation of BART as proposed by Chipman et al. [1]. For details, please refer to the official BartPy repository.

Installation

Using pip

pip install gbart
Note: due to a packaging issue, installing with pip currently misses a subfolder (modified_bart); we recommend installing manually instead.

Manual installation

1. Visit our repository at https://github.com/augusHsu/gbart

2. Download the "gbart" folder directly.

3. Put the folder where you usually import packages from; generally, this is
 ~/bin/python3.7/site-packages/ 
 Alternatively, place it in the same folder as your script.

Usage and examples

The easiest way to run GBART and obtain a result on generated data (a consolidated script follows the step-by-step list below):

  • Preparation
import copy
import numpy as np
import gbart.utilities as ut  # provide helper functions
import gbart.create_dataset as cd  # data generator
from gbart.groupbart import * 
  • Generate a dataset
dataset = cd.create_friedman()
# The last column of the dataset is the dependent variable, the output Y.
  • Build a model and get its accuracy using plain BART.
acc_o = build_original_model(dataset)
# This function splits the dataset into training and testing sets (80% for training)
# and returns the accuracy on the testing data.
  • Get the grouping information
output_pair = get_pair(dataset, theta=1)
# Returns the grouping information; this is the first stage of GBART.
# theta controls variable selection; the suggested range is [0, 1].
# Set theta=0 to turn off variable selection.
# Higher values of theta are likely to remove more variables.

  • Build the GBART model
acc_g = build_group_wise_model(dataset, output_pair)
# Takes "output_pair" as pair_list.
# This function returns the accuracy on the testing data.
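
For convenience, the steps above can be combined into a single script. This is a minimal sketch that simply strings together the calls already shown; the 80/20 train/test split and the error metric are handled inside the helper functions.

import gbart.create_dataset as cd
from gbart.groupbart import *

dataset = cd.create_friedman()               # generated data; last column is the output Y

acc_o = build_original_model(dataset)        # baseline: plain BART
output_pair = get_pair(dataset, theta=1)     # stage 1: search for variable groups
acc_g = build_group_wise_model(dataset, output_pair)  # stage 2: group-wise model

print("BART accuracy:", acc_o)
print("GBART accuracy:", acc_g)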

To design a customized version and/or tune model parameters, please consider the following:

  • Write your own helper functions instead of calling the ones in groupbart.py; the only thing you may need from groupbart.py is get_pair(dataset), which finds the proper grouping information.

  • Tune the parameters of both the BART and GBART models. Here is a simple example.

import numpy as np
import gbart.utilities as ut
from gbart.modified_bartpy.sklearnmodel import SklearnModel

# Data preparation: reuse the dataset generated above (the last column is Y)
b = int(0.8 * np.shape(dataset)[0])  # 80/20 train/test split point
Data_train = dataset[:b, :]
Data_predict = dataset[b:, :]
x_data = Data_train[:, :-1]
y_data = Data_train[:, -1]


# Building the model
model = SklearnModel(sublist=None,
                     n_trees=50,
                     n_chains=4,
                     n_samples=50,
                     n_burn=200,
                     thin=0.1,
                     n_jobs=1)
# The GBART model is built on top of the BART model; a new argument named sublist is added.
# sublist takes either None or a list of variable groups.
# When sublist is None, this is exactly the BART model.
# When sublist is a list, it builds the GBART model.
# To tune other parameters, please refer to the BartPy usage guide.

# Fit and predict
model.fit(x_data, y_data)
y_pred = model.predict(Data_predict[:, :-1])
y_true = Data_predict[:, -1]
acc = ut.get_error_reg(y_pred, y_true)
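
To run the same model in GBART mode, pass a grouping to sublist instead of None. Below is a minimal sketch, assuming the list returned by get_pair(dataset) can be passed directly as sublist, and reusing x_data, y_data, Data_predict and y_true from the snippet above.

from gbart.groupbart import get_pair

output_pair = get_pair(dataset)                  # stage 1: discover the variable groups

group_model = SklearnModel(sublist=output_pair,  # list of groups -> GBART mode
                           n_trees=50,
                           n_chains=4,
                           n_samples=50,
                           n_burn=200,
                           thin=0.1,
                           n_jobs=1)
group_model.fit(x_data, y_data)
y_pred_g = group_model.predict(Data_predict[:, :-1])
acc_g = ut.get_error_reg(y_pred_g, y_true)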

Real data example

  • Preparation
import numpy as np
import pandas as pd
from gbart.groupbart import * 
  • Load and pre-process the data
url = "~/Admission_Predict_Ver1.1.csv"
data = pd.read_csv(url)
data = data.drop(columns="Serial No.")
dataset = data.iloc[:, :].values
# Note: our package is compatible with scikit-learn; we use the same dataset type.
  • Build a model and get its accuracy using plain BART.
acc_o = build_original_model(dataset)
# This function splits the dataset into training and testing sets (80% for training)
# and returns the accuracy on the testing data.
  • Get the grouping information
output_pair = get_pair(dataset)
# Returns the grouping information; this is the first stage of the GBART algorithm.

  • Build the GBART model
acc_g = build_group_wise_model(dataset, output_pair)
# Takes "output_pair" as pair_list.
# This function returns the accuracy on the testing data.

  • To tune the parameters of both the BART and GBART models, please refer to the generated-data example above.

  • One more advanced thing worth exploring: during the group-searching process, a threshold controls variable selection. It is currently fixed at 85 percent of the best score; you can tune it by turning that constant into an argument of get_pair(dataset), as sketched below.
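
Below is a minimal sketch of what such a threshold rule might look like. The function and variable names (select_variables, scores, ratio) are hypothetical and only illustrate keeping the variables whose score is at least a fraction of the best one; the actual rule lives inside get_pair() in groupbart.py and may differ in detail.

import numpy as np

def select_variables(scores, ratio=0.85):
    # Hypothetical illustration: keep the indices of variables whose score is
    # at least `ratio` times the best score (85 percent by default).
    scores = np.asarray(scores, dtype=float)
    threshold = ratio * scores.max()
    return [i for i, s in enumerate(scores) if s >= threshold]

# Example: with ratio=0.85 only the strongest variables survive.
print(select_variables([0.9, 0.95, 0.4, 0.88]))  # -> [0, 1, 3]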

Acknowledgment

We sincerely thank Jake Coltman for his contribution to the BartPy package.

Reference

[1] Chipman, Hugh A., Edward I. George, and Robert E. McCulloch. "BART: Bayesian Additive Regression Trees." The Annals of Applied Statistics 4.1 (2010): 266–298.
