treeboost_autograd
Easy Custom Losses for Tree Boosters using PyTorch
Why calculate first and second derivatives for your objective when you can let PyTorch do it for you?
This package provides an easy-to-use custom PyTorch objective implementation for tree boosters (just add loss).
Supported boosting packages: CatBoost, XGBoost, LightGBM.
Supported tasks: regression, binary classification.
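Binary classification works through the same mechanism: the custom PyTorch loss receives raw prediction scores and 0/1 targets. Below is a minimal sketch, assuming CatBoostClassifier accepts the custom objective the same way the regressor in the Usage section does; the loss function name is hypothetical and the repo examples are the canonical reference.

import torch
from catboost import CatBoostClassifier
from treeboost_autograd import CatboostObjective

def bce_loss(preds: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
    # preds are raw scores (logits); targets are 0/1 labels.
    # The .float() cast is a precaution in case targets arrive as ints.
    return torch.nn.functional.binary_cross_entropy_with_logits(
        preds, targets.float(), reduction="sum")

custom_objective = CatboostObjective(loss_function=bce_loss)
model = CatBoostClassifier(loss_function=custom_objective, eval_metric="Logloss")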
Check out the post on Towards Data Science: https://towardsdatascience.com/easy-custom-losses-for-tree-boosters-using-pytorch-57ffaa0b2eb3
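Under the hood this is plain PyTorch autograd: differentiate the scalar loss once to get the gradient, then differentiate the gradient again for the second derivatives. Here is a standalone sketch of the idea, not necessarily the package's internal code:

import torch

def absolute_error_loss(preds: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
    return torch.abs(preds - targets).sum()

preds = torch.tensor([0.2, 1.7, 3.1], requires_grad=True)
targets = torch.tensor([0.0, 2.0, 3.0])

loss = absolute_error_loss(preds, targets)
# First derivatives w.r.t. each prediction; keep the graph for a second pass.
(grad,) = torch.autograd.grad(loss, preds, create_graph=True)
# Second derivatives: since the loss is a sum over samples, grad[i] depends
# only on preds[i], so differentiating grad.sum() yields the Hessian diagonal.
(hess,) = torch.autograd.grad(grad.sum(), preds)
print(grad)  # tensor([ 1., -1.,  1.])  -- sign(preds - targets)
print(hess)  # tensor([0., 0., 0.])     -- |x| has zero curvature away from 0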
Usage

First, install the package:

pip install treeboost_autograd

Usage is very similar for all boosting libraries: import the relevant objective class and hand it your PyTorch loss function.
from treeboost_autograd import CatboostObjective, LightGbmObjective, XgboostObjective
Ready-to-run examples are available at the Git repo: https://github.com/TomerRonen34/treeboost_autograd/tree/main/examples
For example, training a CatBoost regressor with a custom absolute-error loss:

import torch
from catboost import CatBoostRegressor

def absolute_error_loss(preds: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
    return torch.abs(preds - targets).sum()

custom_objective = CatboostObjective(loss_function=absolute_error_loss)
model = CatBoostRegressor(loss_function=custom_objective, eval_metric="MAE")
model.fit(X_train, y_train)
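The other boosters follow the same pattern. A hedged sketch, assuming XgboostObjective and LightGbmObjective take the same loss_function argument as CatboostObjective and are passed through the boosters' scikit-learn wrappers via their standard objective parameter (see the repo examples for the canonical usage):

from xgboost import XGBRegressor
from lightgbm import LGBMRegressor
from treeboost_autograd import XgboostObjective, LightGbmObjective

# XGBoost: the sklearn wrapper accepts a callable custom objective.
xgb_model = XGBRegressor(objective=XgboostObjective(loss_function=absolute_error_loss))
xgb_model.fit(X_train, y_train)

# LightGBM: likewise, `objective` may be a callable.
lgbm_model = LGBMRegressor(objective=LightGbmObjective(loss_function=absolute_error_loss))
lgbm_model.fit(X_train, y_train)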
Hashes for treeboost_autograd-0.1.2-py2.py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | f3862331d2bd441b4b0c621231a660e9fac1ef04a92bed1bba18068004c27454
MD5 | 8f15c414a7cd4543cfca375f7f90ac8f
BLAKE2b-256 | 618b80cbe09bd03ac1579770017447da78cfe1f72578190996ca61670a05ebbe