
Monotonic composite quantile gradient boosting regressor

Project description

MQBoost

A multiple quantiles estimation model that maintains the non-crossing condition (or monotone quantile condition), built on LightGBM and XGBoost, with the hyperparameter optimization framework Optuna.
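Formally, the non-crossing condition requires the estimated quantile functions to be ordered in the quantile level. A standard statement of the condition (general background, not quoted from the package docs):

\hat{q}_{\alpha_i}(x) \le \hat{q}_{\alpha_j}(x) \quad \text{for all } x \text{ whenever } \alpha_i < \alpha_j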

Installation

Install using pip:

pip install mqboost

Usage

Features

  • MQRegressor: quantile regressor

Parameters

x         # Explanatory data (e.g. pd.DataFrame)
          # The column name '_tau' must not be included
y         # Response data (e.g. np.ndarray)
alphas    # Target quantiles
          # Must be in ascending order and must not contain duplicates
objective # [Optional] objective to minimize: "check" (default) or "huber"
model     # [Optional] boosting algorithm to use: "lightgbm" (default) or "xgboost"
delta     # [Optional] parameter of the "huber" objective; used only when objective == "huber"
          # Must be smaller than 0.1
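For reference, the "check" objective is the standard pinball loss of quantile regression, and "huber" is a smoothed variant that replaces the kink at zero with a quadratic region whose width is controlled by delta. One common formulation (the package's exact smoothing may differ) is

\rho_\alpha(u) = u \left( \alpha - \mathbb{1}\{u < 0\} \right), \qquad u = y - \hat{q}_\alpha(x)

\rho^{\mathrm{huber}}_\alpha(u) = \left| \alpha - \mathbb{1}\{u < 0\} \right| \, h_\delta(u), \qquad
h_\delta(u) = \begin{cases} u^2 / (2\delta) & |u| \le \delta \\ |u| - \delta/2 & |u| > \delta \end{cases}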

Methods

train           # Train the quantile model
                # Any parameter of the underlying model can be passed except "objective"
predict         # Predict with input data
optimize_params # Optimize hyperparameters using Optuna

Example

import numpy as np
from mqboost import MQRegressor

## Generate sample
sample_size = 500
x = np.linspace(-10, 10, sample_size)
y = np.sin(x) + np.random.uniform(-0.4, 0.4, sample_size)
x_test = np.linspace(-10, 10, sample_size)
y_test = np.sin(x_test) + np.random.uniform(-0.4, 0.4, sample_size)

## target quantiles
alphas = [0.3, 0.4, 0.5, 0.6, 0.7]

## model name
model = "lightgbm" # "xgboost"

## objective function
objective = "huber" # "check"
delta = 0.01 # used only when objective == "huber"; defaults to 0.05

## LightGBM based quantile regressor
mq_lgb = MQRegressor(
    x=x,
    y=y,
    alphas=alphas,
    objective=objective,
    model=model,
    delta=delta,
)

## train with fixed params
lgb_params = {
    "max_depth": 4,
    "num_leaves": 15,
    "learning_rate": 0.1,
    "boosting_type": "gbdt",
}
mq_lgb.train(params=lgb_params)

## train with Optuna
mq_lgb.train(n_trials=10) # the number of Optuna trials

## equivalent two-step process
# best_params = mq_lgb.optimize_params(n_trials=10)
# mq_lgb.train(params=best_params)

## predict
preds_lgb = mq_lgb.predict(x=x_test, alphas=alphas)
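As a sanity check, the predictions can be verified against the non-crossing condition. The sketch below assumes preds_lgb is array-like with one row of predictions per quantile level, stacked in the order of alphas (the actual return shape may differ):

import numpy as np

## Stack predictions as (n_quantiles, n_samples); this assumes one
## prediction array per quantile level in `alphas`
preds = np.asarray(preds_lgb)

## Non-crossing condition: predictions must be non-decreasing
## as the quantile level increases
assert np.all(np.diff(preds, axis=0) >= 0), "quantile crossing detected"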
