xgboost-distribution

XGBoost for probabilistic prediction. Like NGBoost, but faster, and in the XGBoost scikit-learn API.

[Figure: XGBDistribution example]

Installation

$ pip install xgboost-distribution

Dependencies:

python_requires = >=3.8

install_requires =
    scikit-learn
    xgboost>=1.7.0

Usage

XGBDistribution follows the XGBoost scikit-learn API, with an additional keyword argument specifying the distribution (see the documentation for a full list of available distributions):

from sklearn.datasets import load_diabetes
from sklearn.model_selection import train_test_split

from xgboost_distribution import XGBDistribution


# Any regression dataset works here; load_boston was removed from
# scikit-learn in version 1.2, so we use the diabetes dataset instead.
X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y)

model = XGBDistribution(
    distribution="normal",
    n_estimators=500,
    early_stopping_rounds=10
)
model.fit(X_train, y_train, eval_set=[(X_test, y_test)])

After fitting, we can predict the parameters of the distribution:

preds = model.predict(X_test)
mean, std = preds.loc, preds.scale

Note that this returns a namedtuple of numpy arrays, one for each parameter of the distribution (we use the scipy stats naming conventions for the parameters; see e.g. scipy.stats.norm for the normal distribution).
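Since the parameter names match scipy.stats, the predicted distributions can be evaluated directly with scipy. A minimal sketch (assuming the preds and y_test from above; the metrics shown are illustrative, not part of the package API):

import numpy as np
from scipy.stats import norm

# Mean negative log-likelihood of the test targets under the predicted normals
nll = -norm.logpdf(y_test, loc=preds.loc, scale=preds.scale).mean()

# Central 90% prediction interval for each test point, and its empirical coverage
lower, upper = norm.interval(0.9, loc=preds.loc, scale=preds.scale)
coverage = np.mean((y_test >= lower) & (y_test <= upper))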

NGBoost performance comparison

XGBDistribution follows the method shown in the NGBoost library, using natural gradients to estimate the parameters of the distribution.
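As an illustration of the idea (a minimal sketch, not the package's internal implementation): for a normal distribution parameterized as (loc, log_scale), the natural gradient is the ordinary gradient of the negative log-likelihood preconditioned by the inverse Fisher information, which in this parameterization is diagonal:

import numpy as np

def natural_gradient_normal(y, loc, log_scale):
    # Illustrative sketch only. For the (loc, log_scale) parameterization,
    # the Fisher information is diag(1 / scale**2, 2), so its inverse is
    # diag(scale**2, 0.5).
    scale = np.exp(log_scale)
    z = (y - loc) / scale
    grad_loc = -z / scale        # d NLL / d loc
    grad_log_scale = 1.0 - z**2  # d NLL / d log_scale
    # Precondition the ordinary gradient with the inverse Fisher information
    return np.stack([scale**2 * grad_loc, 0.5 * grad_log_scale], axis=1)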

Below, we show a performance comparison of XGBDistribution with the NGBoost NGBRegressor, using the Boston Housing dataset and estimating normal distributions. While the performance of the two models is essentially identical (measured by the negative log-likelihood of a normal distribution and by RMSE), XGBDistribution is 30x faster (timed over both the fit and predict steps):

[Figure: XGBDistribution vs NGBoost benchmark]

Please see the experiments page in the documentation for detailed results across various datasets.

Full XGBoost features

XGBDistribution offers the full set of XGBoost features available in the XGBoost scikit-learn API, allowing, for example, probabilistic regression with monotonic constraints:

[Figure: XGBDistribution with monotonic constraints]
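A minimal sketch of how this looks (monotone_constraints is the standard XGBoost parameter; the two-feature setup here is purely illustrative):

# The first feature is constrained to have a monotonically increasing
# effect on the predicted distribution, the second a decreasing one.
model = XGBDistribution(
    distribution="normal",
    monotone_constraints=(1, -1),
    n_estimators=500,
)
model.fit(X_train[:, :2], y_train)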

Acknowledgements

This package would not exist without the excellent work from:

  • NGBoost - Which demonstrated how gradient boosting with natural gradients can be used to estimate parameters of distributions. Much of the gradient calculation code was adapted from there.

  • XGBoost - Which provides the gradient boosting algorithms used here; in particular, the sklearn APIs were taken as a blueprint.

Note

This project has been set up using PyScaffold 4.0.1. For details and usage information on PyScaffold see https://pyscaffold.org/.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

xgboost-distribution-0.2.6.tar.gz (208.9 kB)

Uploaded: Source

Built Distribution

xgboost_distribution-0.2.6-py2.py3-none-any.whl (17.6 kB)

Uploaded: Python 2, Python 3

File details

Details for the file xgboost-distribution-0.2.6.tar.gz.

File metadata

  • Download URL: xgboost-distribution-0.2.6.tar.gz
  • Upload date:
  • Size: 208.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.16

File hashes

Hashes for xgboost-distribution-0.2.6.tar.gz

Algorithm    Hash digest
SHA256       8116ceeb63a9b236ccf8942cf37de0e1717fde82a8e1966cceb97f8268b11eb9
MD5          eb2ee289ab35339b8c5c2c9ef84ff9ef
BLAKE2b-256  b161f8083588be42f837c477df90bc25f5e9aea1e5c670dc3de4cca28521eded


File details

Details for the file xgboost_distribution-0.2.6-py2.py3-none-any.whl.

File hashes

Hashes for xgboost_distribution-0.2.6-py2.py3-none-any.whl

Algorithm    Hash digest
SHA256       4fe32e96da970d349defb21a13a32967de0218efb6cf1143e4499e20bebe10e0
MD5          e64491dbe4b95cca67e35fa58e420f0e
BLAKE2b-256  807caef0c4f56b3946a033468901ad2d2bfc25affd67d3586a28749cab7570f6

