
Interactions between Dask and XGBoost

Project description

Distributed training with XGBoost and Dask.distributed

This repository offers a legacy option to perform distributed training with XGBoost on Dask.array and Dask.dataframe collections.

pip install dask-xgboost

Please note that XGBoost now includes a Dask API as part of its official Python package. That API is independent of dask-xgboost and is now the recommended way to use Dask and XGBoost together. See the xgb.dask documentation at https://xgboost.readthedocs.io/en/latest/tutorials/dask.html for more details on the new API.
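
For comparison, here is a minimal sketch of that newer workflow with the xgb.dask API. The scheduler address and the synthetic data are placeholders; see the linked documentation for authoritative usage.

import dask.array as da
import xgboost as xgb
from dask.distributed import Client

client = Client('scheduler-address:8786')        # placeholder scheduler address

# Synthetic data purely for illustration
X = da.random.random((100_000, 20), chunks=(10_000, 20))
y = da.random.randint(0, 2, size=100_000, chunks=10_000)

dtrain = xgb.dask.DaskDMatrix(client, X, y)
output = xgb.dask.train(client,
                        {'objective': 'binary:logistic'},
                        dtrain,
                        num_boost_round=10)
booster = output['booster']                      # a plain xgboost.core.Booster
predictions = xgb.dask.predict(client, output, X)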

Example

from dask.distributed import Client
client = Client('scheduler-address:8786')  # connect to cluster

import dask.dataframe as dd
df = dd.read_csv('...')  # use dask.dataframe to load and
df_train = ...           # preprocess data
labels_train = ...

import dask_xgboost as dxgb
params = {'objective': 'binary:logistic', ...}  # use normal xgboost params
bst = dxgb.train(client, params, df_train, labels_train)

>>> bst  # Get back normal XGBoost result
<xgboost.core.Booster at ... >

predictions = dxgb.predict(client, bst, data_test)
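
As a small follow-up, and assuming predict returns a lazy Dask collection matching its input type (a Dask Series here, since the input was a Dask DataFrame), you materialize the results when you need concrete values:

local_predictions = predictions.compute()  # pull results back to the client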

How this works

For more information on using Dask.dataframe for preprocessing see the Dask.dataframe documentation.
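
As a concrete illustration, the preprocessing step for the example above might look like the following sketch; the file pattern and the 'target' column name are hypothetical.

import dask.dataframe as dd

df = dd.read_csv('data-*.csv')            # hypothetical file pattern
df = df.dropna()                          # example cleaning step
labels_train = df['target']               # labels as a Dask Series
df_train = df.drop('target', axis=1)      # remaining columns become the features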

Once you have created suitable data and labels, we are ready for distributed training with XGBoost. Every Dask worker sets up an XGBoost worker and gives it enough information to find the others. Then the Dask workers hand their in-memory pandas DataFrames to XGBoost (one Dask DataFrame is just many pandas DataFrames spread across the memory of many machines). XGBoost handles the distributed training on its own, without interference from Dask, and then hands back a single xgboost.Booster result object.
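
Conceptually, the per-worker step looks roughly like the sketch below: concatenate the pandas partitions held on that worker, wrap them in a DMatrix, and call the ordinary xgboost.train while connected to the other workers through XGBoost's Rabit layer. This is a simplified illustration under the older xgboost.rabit interface that dask-xgboost targeted (newer XGBoost releases replace it with xgboost.collective), not the package's actual code; the real implementation also runs the Rabit tracker and distributes its connection details to the workers.

import pandas as pd
import xgboost as xgb

def train_on_worker(local_parts, local_labels, params, rabit_args):
    # local_parts / local_labels: the pandas pieces of the Dask collections that
    # live on this worker; rabit_args: tracker connection info, encoded the way
    # xgb.rabit.init expects.
    xgb.rabit.init(rabit_args)                    # join the XGBoost worker group
    try:
        data = pd.concat(local_parts)             # one local pandas DataFrame
        labels = pd.concat(local_labels)
        dtrain = xgb.DMatrix(data, label=labels)  # XGBoost's native data structure
        bst = xgb.train(params, dtrain)           # gradient sync happens via Rabit
    finally:
        xgb.rabit.finalize()                      # leave the worker group
    return bst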

Larger Example

For a more serious example see

History

Conversation during development happened at dmlc/xgboost #2032

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

dask-xgboost-0.2.0.tar.gz (18.0 kB)

Uploaded: Source

Built Distribution

dask_xgboost-0.2.0-py2.py3-none-any.whl (14.1 kB)

Uploaded: Python 2, Python 3

File details

Details for the file dask-xgboost-0.2.0.tar.gz.

File metadata

  • Download URL: dask-xgboost-0.2.0.tar.gz
  • Upload date:
  • Size: 18.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.6.1 pkginfo/1.7.1 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.2 CPython/3.7.10

File hashes

Hashes for dask-xgboost-0.2.0.tar.gz
  • SHA256: 6d9c491dc4099f74a0df66c4d439d296c0f1fba97009fe93e21b2350f295b4ca
  • MD5: 5ead54141baa7215b086b1448b866613
  • BLAKE2b-256: 74a337471a21f7e13ba23823eb837cee62fd31ba77f489ebe381cd9a6ed764e3

See more details on using hashes here.

File details

Details for the file dask_xgboost-0.2.0-py2.py3-none-any.whl.

File metadata

  • Download URL: dask_xgboost-0.2.0-py2.py3-none-any.whl
  • Upload date:
  • Size: 14.1 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.6.1 pkginfo/1.7.1 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.2 CPython/3.7.10

File hashes

Hashes for dask_xgboost-0.2.0-py2.py3-none-any.whl
  • SHA256: 47b7d96e981d8d7aa81bd15578f470d80ee92ae5aac122adc3bc7e1c9f941682
  • MD5: d9d8e5757ad693d10763307767f5b432
  • BLAKE2b-256: 0d3390fec71df94921d9e604c1f3812b7bb9573ce93aec0637df8a319a7ea42b

See more details on using hashes here.
