Deploy Dask on job queuing systems like PBS or SLURM

Project description

Dask-Jobqueue makes it easy to deploy Dask on batch-style job schedulers such as PBS and SLURM.


from dask_jobqueue import PBSCluster
from dask.distributed import Client

# Request PBS workers with 6 processes, 4 threads per process, and 16 GB of memory
cluster = PBSCluster(processes=6, threads=4, memory="16GB")

# Connect a Dask client to the cluster
client = Client(cluster)


Dask-Jobqueue can also adapt the cluster size dynamically based on current load, scaling up when computations demand it and scaling down to free resources when the cluster is idle.
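At its core, adaptive scaling amounts to choosing a target worker count from the current load and clamping it between a minimum and a maximum. The sketch below is a toy illustration of that idea, not the library's actual implementation (which lives in dask.distributed's adaptive machinery); the function name and its parameters are invented here for illustration.

```python
import math

# Toy sketch of adaptive-scaling logic; target_workers and its
# parameters are hypothetical, for illustration only.
def target_workers(pending_tasks, tasks_per_worker, minimum, maximum):
    """Desired worker count for the current load, clamped to [minimum, maximum]."""
    needed = math.ceil(pending_tasks / tasks_per_worker)
    return max(minimum, min(maximum, needed))

# An idle cluster shrinks to the floor; heavy load grows toward the ceiling.
print(target_workers(0, 10, minimum=1, maximum=20))     # → 1
print(target_workers(1000, 10, minimum=1, maximum=20))  # → 20
```

With a real cluster object, recent dask-jobqueue releases expose this behavior through something like cluster.adapt(minimum=..., maximum=...); check the documentation for the exact interface in your version.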



This package came out of the Pangeo collaboration and was copy-pasted from a live repository at this commit. Unfortunately, development history was not preserved.

Original developers include the following:

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Files for dask-jobqueue, version 0.1.0

Filename                              Size     File type  Python version
dask_jobqueue-0.1.0-py3-none-any.whl  11.9 kB  Wheel      py3
dask-jobqueue-0.1.0.tar.gz            23.9 kB  Source     None
