Monotonic composite quantile gradient boosting regressor

Project description

MQBoost

A multiple-quantile estimation model that enforces the non-crossing (monotone quantile) condition, built on:

  • LightGBM
  • XGBoost
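
To make the non-crossing condition concrete: for ascending quantile levels, the fitted quantile curves must satisfy q_alpha1(x) <= q_alpha2(x) for alpha1 < alpha2 at every input x. Below is a minimal sketch of that check on a hypothetical prediction matrix; the names and shapes are illustrative only and not part of the package API.

import numpy as np

# Hypothetical predictions: one row per quantile level (ascending alphas),
# one column per sample. Non-crossing means every row is elementwise <= the next.
preds = np.array([
    [0.1, 0.2, 0.3],  # alpha = 0.3
    [0.2, 0.3, 0.4],  # alpha = 0.5
    [0.4, 0.5, 0.6],  # alpha = 0.7
])
assert np.all(np.diff(preds, axis=0) >= 0)  # quantile curves do not cross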

Installation

Install using pip:

pip install mqboost

Usage

Features

  • MQRegressor: quantile regressor

Parameters

x         # Explanatory data (e.g. pd.DataFrame)
          # The column name '_tau' must not be included
y         # Response data (e.g. np.ndarray)
alphas    # Target quantiles
          # Must be in ascending order and contain no duplicates
objective # [Optional] objective to minimize: "check" (default) or "huber"
          # (a rough sketch of both losses follows this block)
model     # [Optional] boosting algorithm to use: "lightgbm" (default) or "xgboost"
delta     # [Optional] parameter of the "huber" objective, used only when objective == "huber"
          # Must be smaller than 0.1
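
For intuition, "check" corresponds to the standard pinball (check) loss, and "huber" to a Huber-smoothed variant of it controlled by delta. The functions below are a rough sketch of those losses under that interpretation, not the package's internal implementation.

import numpy as np

def check_loss(y, pred, alpha):
    # Pinball / check loss for quantile level alpha.
    u = y - pred
    return np.mean(np.maximum(alpha * u, (alpha - 1) * u))

def huber_check_loss(y, pred, alpha, delta=0.05):
    # Huber-smoothed check loss: quadratic for |u| <= delta, linear outside.
    # Illustrative only; MQBoost's "huber" objective may differ in detail.
    u = y - pred
    abs_u = np.abs(u)
    huber = np.where(abs_u <= delta, 0.5 * u**2 / delta, abs_u - 0.5 * delta)
    return np.mean(np.abs(alpha - (u < 0)) * huber)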

Methods

train     # Train the quantile model
          # Any model-related params can be passed except "objective"
predict   # Predict with input data

Example

import numpy as np
from mqboost import MQRegressor

## Generate sample
sample_size = 500
x = np.linspace(-10, 10, sample_size)
y = np.sin(x) + np.random.uniform(-0.4, 0.4, sample_size)
x_test = np.linspace(-10, 10, sample_size)
y_test = np.sin(x_test) + np.random.uniform(-0.4, 0.4, sample_size)

## target quantiles
alphas = [0.3, 0.4, 0.5, 0.6, 0.7]

## model name
model = "lightgbm"  # or "xgboost"

## objective function
objective = "huber"  # or "check"
delta = 0.01  # used when objective == "huber"; default 0.05

## LightGBM based quantile regressor
mq_lgb = MQRegressor(
    x=x,
    y=y,
    alphas=alphas,
    objective=objective,
    model=model,
    delta=delta,
)

## train
lgb_params = {
    "max_depth": 4,
    "num_leaves": 15,
    "learning_rate": 0.1,
    "boosting_type": "gbdt",
}
mq_lgb.train(params=lgb_params)

## predict
preds_lgb = mq_lgb.predict(x=x_test, alphas=alphas)
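
As a quick follow-up, the non-crossing property can be spot-checked on the predictions. This assumes preds_lgb converts to an array with one row per quantile level in alphas; the actual return type and shape may differ.

## sanity check: predicted quantiles should not cross
preds_arr = np.asarray(preds_lgb)  # assumed shape: (len(alphas), len(x_test))
assert np.all(np.diff(preds_arr, axis=0) >= -1e-12)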

Download files

Download the file for your platform.

Source Distribution

mqboost-0.0.2.tar.gz (4.8 kB)

Built Distribution

mqboost-0.0.2-py3-none-any.whl (6.3 kB)

File details

Details for the file mqboost-0.0.2.tar.gz.

File metadata

  • Download URL: mqboost-0.0.2.tar.gz
  • Upload date:
  • Size: 4.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.12.1 Linux/6.5.0-1023-azure

File hashes

Hashes for mqboost-0.0.2.tar.gz

  • SHA256: 9216892d5065496500e1c63018fcef5597a0a82d97f7297bcee0a4c5768e1660
  • MD5: 4faa8411aeb896920d521d7eb8b0fa63
  • BLAKE2b-256: fae7373cc3e8a729ffc788ea754c9863a31b2fd9dc70c7535bf970c1114bcd49


File details

Details for the file mqboost-0.0.2-py3-none-any.whl.

File metadata

  • Download URL: mqboost-0.0.2-py3-none-any.whl
  • Upload date:
  • Size: 6.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.12.1 Linux/6.5.0-1023-azure

File hashes

Hashes for mqboost-0.0.2-py3-none-any.whl

  • SHA256: 159c6a92856fa7e9b34be756a25306b96660c9cf34cf8340483da2b5a711714b
  • MD5: 99d05efd46dc07dd07f28315f94631b5
  • BLAKE2b-256: c79d9e1a83695616836a0d39f7eadcdf151d73142d8a7a639829d3f504e850a9

