
Dask Cluster objects in Saturn Cloud

Project description


Python library for interacting with Dask clusters in Saturn Cloud.

Dask-Saturn mimics the API of Dask-Kubernetes, but allows the user to interact with clusters created within Saturn Cloud.

Start cluster

From within a Jupyter notebook, you can start a cluster:

from dask_saturn import SaturnCluster

cluster = SaturnCluster()

By default this will start a dask cluster with the same settings that you have already set in the Saturn UI or in a prior notebook.

To start the cluster with a certain number of workers, use the n_workers option. Similarly, you can set scheduler_size, worker_size, and worker_is_spot.
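
For example (a minimal sketch; the size strings below are illustrative and must correspond to sizes available in your Saturn Cloud installation):

cluster = SaturnCluster(
    n_workers=3,
    scheduler_size="medium",
    worker_size="xlarge",
    worker_is_spot=False,
)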

Note: If the cluster is already running then you can't change the settings. Attempting to do so will raise a warning.

Use the autoclose option to set up a cluster that is tied to the client kernel. This functions like a regular Dask LocalCluster: when your Jupyter kernel dies or is restarted, the Dask cluster will close.
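
For example:

cluster = SaturnCluster(autoclose=True)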

Adjust number of workers

Once you have a cluster, you can interact with it via the Jupyter widget or by using the scale and adapt methods.

For example, to manually scale up to 20 workers:
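
cluster.scale(20)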


To create an adaptive cluster that controls its own scaling:

cluster.adapt(minimum=1, maximum=20)

Interact with client

To submit tasks to the cluster, you sometimes need access to the Client object. Instantiate this with the cluster as the only argument:

from distributed import Client

client = Client(cluster)

Close cluster

To terminate all resources associated with a cluster, use the close method:
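
cluster.close()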


Change settings

To update the settings (such as n_workers, worker_size, worker_is_spot, nthreads) on an existing cluster, use the reset method:
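
cluster.reset(n_workers=3)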


You can also call this without instantiating the cluster first:

cluster = SaturnCluster.reset(n_workers=3)

Sync files to workers

When working with a distributed Dask cluster, the workers don't have access to the same file system as your client, so files that you see on your Jupyter server aren't available on the workers. To move files to the workers, use the RegisterFiles plugin and call sync_files on any path that you want to update on the workers.

For instance, if you have a file structure like:

|---- utils/
|   |---- __init__.py
|   |---- hello.py
|---- Untitled.ipynb

where hello.py contains:

# utils/hello.py
def greet():
    return "Hello"

If the code in hello.py changes or you add new files to utils, you'll want to push those changes to the workers. After setting up the SaturnCluster and the Client, register the RegisterFiles plugin with the workers. Then, every time you make changes to the files in utils, run sync_files. The worker plugin ensures that any new worker that comes up also gets the files you have synced.

from dask_saturn import RegisterFiles, sync_files

# register the plugin so current and future workers receive synced files
client.register_worker_plugin(RegisterFiles())

# push the contents of the utils directory to the workers
sync_files(client, "utils")

# If a python script has changed, restart the workers so they will see the changes
client.restart()

# import the function and tell the workers to run it
from utils.hello import greet
client.submit(greet).result()
TIP: You can always check the state of the filesystem on your workers by running
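
# for example, using Client.run (os.listdir here is just one illustration)
import os
client.run(os.listdir)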


Development

Create/update a dask-saturn conda environment:

make conda-update

Set environment variables to run dask-saturn with a local atlas server:

export BASE_URL=


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Files for dask-saturn, version 0.2.3

dask_saturn-0.2.3-py3-none-any.whl (14.4 kB): Wheel, Python py3
dask-saturn-0.2.3.tar.gz (30.4 kB): Source distribution
