
Monotonic composite quantile gradient boost regressor

Project description

MQBoost

A multiple-quantile estimation model that maintains the non-crossing condition (also called the monotone quantile condition; see the note below the list), built with:

  • LightGBM
  • XGBoost
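
Here the non-crossing condition means that for any two target quantile levels α_i < α_j, the fitted estimates satisfy q̂_{α_i}(x) ≤ q̂_{α_j}(x) for every input x, so the predicted quantile curves never cross.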

Installation

Install using pip:

pip install mqboost

Usage

Features

  • MQRegressor: quantile regressor

Parameters

x         # Explanatory data (e.g. pd.DataFrame)
          # The column name '_tau' must not be included
y         # Response data (e.g. np.ndarray)
alphas    # Target quantiles
          # Must be in ascending order and must not contain duplicates
objective # [Optional] objective to minimize: "check" (default) or "huber"
model     # [Optional] boosting algorithm to use: "lightgbm" (default) or "xgboost"
delta     # [Optional] delta parameter of the "huber" objective, used only when objective == "huber"
          # Must be smaller than 0.1
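
As a brief sketch of how these parameters fit together (illustrative only; the synthetic data and chosen values below are assumptions, not part of the library), a regressor using the "huber" objective with its delta parameter could be constructed like this:

import numpy as np
from mqboost import MQRegressor

## Small synthetic dataset (assumed for illustration)
x = np.linspace(-5, 5, 100)
y = np.cos(x) + np.random.uniform(-0.2, 0.2, 100)

## The "huber" objective uses delta, which must be smaller than 0.1
mq_huber = MQRegressor(
    x=x,
    y=y,
    alphas=[0.1, 0.5, 0.9],  # ascending and without duplicates
    objective="huber",
    model="lightgbm",
    delta=0.05,
)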

Methods

train     # Train the quantile model
          # Any parameter of the underlying boosting model can be passed except "objective"
predict   # Predict with input data

Example

import numpy as np
from mqboost import MQRegressor

## Generate sample
sample_size = 500
x = np.linspace(-10, 10, sample_size)
y = np.sin(x) + np.random.uniform(-0.4, 0.4, sample_size)
x_test = np.linspace(-10, 10, sample_size)
y_test = np.sin(x_test) + np.random.uniform(-0.4, 0.4, sample_size)

## Target quantiles
alphas = [0.3, 0.4, 0.5, 0.6, 0.7]

## LightGBM based quantile regressor
mq_lgb = MQRegressor(
    x=x,
    y=y,
    alphas=alphas,
)
lgb_params = {
    "max_depth": 4,
    "num_leaves": 15,
    "learning_rate": 0.1,
    "boosting_type": "gbdt",
}
mq_lgb.train(params=lgb_params)
preds_lgb = mq_lgb.predict(x=x_test, alphas=alphas)

## XGBoost based quantile regressor
mq_xgb = MQRegressor(
    x=x,
    y=y,
    alphas=alphas,
    objective="check",
    model="xgboost",
)
xgb_params = {
    "learning_rate": 0.65,
    "max_depth": 10,
}
mq_xgb.train(params=xgb_params)
preds_xgb = mq_xgb.predict(x=x_test, alphas=alphas)
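
A quick sanity check of the non-crossing property is to compare predictions across adjacent quantile levels. The snippet below is a sketch that assumes the predictions can be arranged into an array with one row per quantile level in the order of alphas; the actual return type of predict may differ, so adapt the reshaping accordingly.

## Assumed layout: one row per quantile level, ordered as in `alphas`
preds = np.asarray(preds_lgb).reshape(len(alphas), -1)
non_crossing = bool(np.all(np.diff(preds, axis=0) >= 0))  # q_0.3 <= q_0.4 <= ... <= q_0.7 pointwise
print(f"Non-crossing condition holds: {non_crossing}")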

Download files

Download the file for your platform.

Source Distribution

mqboost-0.0.1.tar.gz (4.7 kB)


Built Distribution

mqboost-0.0.1-py3-none-any.whl (6.2 kB)


File details

Details for the file mqboost-0.0.1.tar.gz.

File metadata

  • Download URL: mqboost-0.0.1.tar.gz
  • Upload date:
  • Size: 4.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.12.1 Linux/6.5.0-1023-azure

File hashes

Hashes for mqboost-0.0.1.tar.gz

  • SHA256: 3f77a5198b2c69e26a70490c2081fdceddab572f272ab10679398876ae4ba746
  • MD5: 6edc0985aef2eb98b69742dfd3388db8
  • BLAKE2b-256: 875130a446c02c2d99b04001bf7b8512f89b0f11c0669f787bb3dc16c19d49fb


File details

Details for the file mqboost-0.0.1-py3-none-any.whl.

File metadata

  • Download URL: mqboost-0.0.1-py3-none-any.whl
  • Upload date:
  • Size: 6.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.12.1 Linux/6.5.0-1023-azure

File hashes

Hashes for mqboost-0.0.1-py3-none-any.whl

  • SHA256: eac1e7e1ceb728d5f961865db4097a6535d03815976ee890a9663cb9851adb36
  • MD5: 3a09acf90d949345a687ba3438b2e143
  • BLAKE2b-256: 4fc23c68cb8d18ac2440c5b788728b0b0a685acf57502e6d34981759961c9c00

