
LightGBM distributed training on Dask

Project description

Dask-LightGBM


Distributed training with LightGBM and Dask.distributed

This repository enables you to perform distributed training with LightGBM on Dask.Array and Dask.DataFrame collections. It is based on the dask-xgboost package.

Usage

Load your data into a distributed data structure, either a Dask.Array or a Dask.DataFrame. Connect to a Dask cluster with a Dask.distributed.Client. Then let dask-lightgbm train a model or make predictions for you. See the system tests for sample code: https://github.com/dask/dask-lightgbm/blob/master/system_tests/test_fit_predict.py
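
A minimal sketch of that workflow, assuming the scikit-learn-style LGBMClassifier exposed in dask_lightgbm.core (as used in the linked system tests); the scheduler address and the random data are placeholders:

    import dask.array as da
    from dask.distributed import Client

    import dask_lightgbm.core as dlgbm

    # Connect to an existing Dask cluster (placeholder address; omit it to start a local cluster).
    client = Client("tcp://scheduler-address:8786")

    # Features and labels as Dask collections; random data here purely for illustration.
    X = da.random.random((100_000, 20), chunks=(10_000, 20))
    y = (da.random.random(100_000, chunks=10_000) > 0.5).astype(int)

    # Train a distributed LightGBM model and predict, scikit-learn style.
    model = dlgbm.LGBMClassifier(n_estimators=50)
    model.fit(X, y)
    predictions = model.predict(X)
    print(predictions[:10].compute())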

How this works

Dask is used mainly for accessing the cluster and managing data. The library ensures that the features and the label for each sample are located on the same worker. It also lets each worker know the addresses and available ports of all other workers. The distributed training itself is performed by the LightGBM library over sockets. More details on distributed training in LightGBM: https://github.com/microsoft/LightGBM/blob/master/docs/Parallel-Learning-Guide.rst
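
To make the coordination step concrete, here is a small, hypothetical helper showing how a LightGBM "machines" list can be derived from a Dask client. dask-lightgbm performs an equivalent step internally (and also handles picking a usable port on each worker), so this is an illustration, not the library's actual code:

    from dask.distributed import Client, LocalCluster

    def lightgbm_network_params(client, local_listen_port=12400):
        # scheduler_info() reports every connected worker, keyed by its TCP
        # address, e.g. "tcp://10.0.0.1:39217"; strip the scheme and the port.
        workers = client.scheduler_info()["workers"]
        hosts = [addr.split("://")[-1].rsplit(":", 1)[0] for addr in workers]
        # LightGBM's distributed-training parameters: a comma-separated
        # "host:port" list, the worker count, the local port, and the
        # data-parallel tree learner.
        return {
            "machines": ",".join(f"{host}:{local_listen_port}" for host in hosts),
            "num_machines": len(hosts),
            "local_listen_port": local_listen_port,
            "tree_learner": "data",
        }

    if __name__ == "__main__":
        client = Client(LocalCluster(n_workers=2))
        print(lightgbm_network_params(client))

Note that on a single host (such as the LocalCluster above) each worker would need a distinct local_listen_port; handling that detail is part of what dask-lightgbm automates.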

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Files for dask-lightgbm, version 0.1.0

Filename                               Size    File type  Python version
dask_lightgbm-0.1.0-py3-none-any.whl   5.6 kB  Wheel      py3
dask-lightgbm-0.1.0.tar.gz             5.6 kB  Source     None
