Monotonic composite quantile gradient boost regressor

Project description

MQBoost estimates multiple quantiles simultaneously while enforcing the non-crossing (monotone quantile) condition. It builds on LightGBM and XGBoost, two leading gradient boosting frameworks.
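
Concretely, the non-crossing condition means the fitted quantile curves never intersect. In standard notation (this is the general definition, not package-specific code): for any two quantile levels with \alpha_i < \alpha_j,

    \hat{q}_{\alpha_i}(x) \le \hat{q}_{\alpha_j}(x) \quad \text{for all } x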

MQBoost also integrates Optuna for hyperparameter optimization: Optuna's search algorithms tune the boosting hyperparameters automatically, so the model performs well without manual tuning.

Installation

Install using pip:

pip install mqboost

Usage

Features

  • MQRegressor: quantile regressor

Parameters

x         # Explanatory data (e.g. pd.DataFrame)
          # The column name '_tau' must not be included
y         # Response data (e.g. np.ndarray)
alphas    # Target quantiles
          # Must be in ascending order and contain no duplicates
objective # [Optional] objective to minimize: "check" (default) or "huber"
model     # [Optional] boosting algorithm to use: "lightgbm" (default) or "xgboost"
delta     # [Optional] parameter of the "huber" objective; used only when objective == "huber"
          # Must be smaller than 0.1
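
For reference, the default "check" objective is the standard pinball (check) loss of quantile regression; with residual u = y - \hat{q}_\alpha(x) it reads

    \rho_\alpha(u) = u \,\bigl(\alpha - \mathbb{1}\{u < 0\}\bigr)

The "huber" objective smooths this loss near u = 0, with the smoothing width controlled by delta; the exact smoothing formula is an implementation detail of the package.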

Methods

train           # Train the quantile model
                # Any model-related parameter can be passed except "objective"
predict         # Predict with input data
optimize_params # Optimize hyperparameters with Optuna

Example

import numpy as np
from mqboost import MQRegressor

# Generate sample data
sample_size = 500
x = np.linspace(-10, 10, sample_size)
y = np.sin(x) + np.random.uniform(-0.4, 0.4, sample_size)
x_test = np.linspace(-10, 10, sample_size)
y_test = np.sin(x_test) + np.random.uniform(-0.4, 0.4, sample_size)

# Define target quantiles
alphas = [0.3, 0.4, 0.5, 0.6, 0.7]

# Specify model type
model = "lightgbm"  # Options: "lightgbm" or "xgboost"

# Set objective function
objective = "huber"  # Options: "huber" or "check"
delta = 0.01  # Set when objective is "huber", default is 0.05

# Initialize the LightGBM-based quantile regressor
mq_lgb = MQRegressor(
    x=x,
    y=y,
    alphas=alphas,
    objective=objective,
    model=model,
    delta=delta,
)

# Train the model with fixed parameters
lgb_params = {
    "max_depth": 4,
    "num_leaves": 15,
    "learning_rate": 0.1,
    "boosting_type": "gbdt",
}
mq_lgb.train(params=lgb_params)

# Train the model with Optuna hyperparameter optimization
mq_lgb.train(n_trials=10)
# Alternatively, you can optimize parameters first and then train
# best_params = mq_lgb.optimize_params(n_trials=10)
# mq_lgb.train(params=best_params)

# Predict using the trained model
preds_lgb = mq_lgb.predict(x=x_test, alphas=alphas)
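
As a quick sanity check of the non-crossing guarantee, you can verify that the predictions increase with the quantile level. The snippet below continues the example and assumes preds_lgb can be stacked into an array of shape (len(alphas), sample_size); the actual return type of predict may differ.

# Check monotonicity across quantile levels.
# Assumes quantile predictions are stacked row-wise in alphas order.
preds = np.asarray(preds_lgb)
assert (np.diff(preds, axis=0) >= 0).all(), "quantile crossing detected"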

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mqboost-0.1.3.tar.gz (8.5 kB)

Built Distribution

mqboost-0.1.3-py3-none-any.whl (9.5 kB)

File details

Details for the file mqboost-0.1.3.tar.gz.

File metadata

  • Download URL: mqboost-0.1.3.tar.gz
  • Upload date:
  • Size: 8.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.12.1 Linux/6.5.0-1023-azure

File hashes

Hashes for mqboost-0.1.3.tar.gz
Algorithm    Hash digest
SHA256       73428afa29d7b14a65cb9b864348163c8378345ca72dc6725235347cabf173f8
MD5          0609d0f650def8b1f5fd2119369d1afb
BLAKE2b-256  73fa636f36ddff117ee4896f62580cd8856a5a982c87a4acb91f3861d743f93e

File details

Details for the file mqboost-0.1.3-py3-none-any.whl.

File metadata

  • Download URL: mqboost-0.1.3-py3-none-any.whl
  • Upload date:
  • Size: 9.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.12.1 Linux/6.5.0-1023-azure

File hashes

Hashes for mqboost-0.1.3-py3-none-any.whl
Algorithm    Hash digest
SHA256       7d4fa3b58680964c86f022ad8fa6142f5be84409e130c041b0e67f4418499ab0
MD5          1ed2cbeed3271443bd439817df9182d1
BLAKE2b-256  e4b7b888d2c28f4edc2c004b97f9bf372942f3b99b48a8c19dd0a8ab58ce48de
