
ncar-jobqueue


Utilities for expanding dask-jobqueue with appropriate settings for NCAR's clusters.

Supported clusters:

  • Cheyenne
  • Casper (DAV)
  • CGD's Hobart
  • CGD's Izumi

Installation

ncar-jobqueue can be installed from PyPI with pip:

python -m pip install ncar-jobqueue

ncar-jobqueue is also available on conda-forge for conda installations:

conda install -c conda-forge ncar-jobqueue

Usage

Casper

>>> from ncar_jobqueue import NCARCluster
>>> from dask.distributed import Client
>>> cluster = NCARCluster()
>>> cluster
NCARCluster(cores=0, memory=0 B, workers=0/0, jobs=0/0)
>>> cluster.scale(jobs=2)
>>> cluster
NCARCluster(cores=2, memory=50.00 GB, workers=2/2, jobs=2/2)
>>> client = Client(cluster)

Cheyenne

>>> from ncar_jobqueue import NCARCluster
>>> from dask.distributed import Client
>>> cluster = NCARCluster()
>>> cluster
NCARCluster(cores=0, memory=0 B, workers=0/0, jobs=0/0)
>>> cluster.scale(jobs=2)
>>> cluster
NCARCluster(cores=72, memory=218.00 GB, workers=2/2, jobs=2/2)
>>> client = Client(cluster)

Hobart

>>> from ncar_jobqueue import NCARCluster
>>> from dask.distributed import Client
>>> cluster = NCARCluster()
>>> cluster
NCARCluster(cores=0, memory=0 B, workers=0/0, jobs=0/0)
>>> cluster.scale(jobs=2)
>>> cluster
NCARCluster(cores=96, memory=192.00 GB, workers=2/2, jobs=2/2)
>>> client = Client(cluster)

Izumi

>>> from ncar_jobqueue import NCARCluster
>>> from dask.distributed import Client
>>> cluster = NCARCluster()
>>> cluster
NCARCluster(cores=0, memory=0 B, workers=0/0, jobs=0/0)
>>> cluster.scale(jobs=2)
>>> cluster
NCARCluster(cores=96, memory=192.00 GB, workers=2/2, jobs=2/2)
>>> client = Client(cluster)
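The totals reported in the reprs above are simply the per-job worker spec multiplied by the number of jobs requested via `scale(jobs=...)`. The per-job values below are inferred from the example output, not taken from the package's configuration files (the actual defaults live in ncar-jobqueue's config), so treat this as an illustrative sketch:

```python
# Hypothetical per-job worker specs, inferred from the example reprs
# above; the package's real defaults come from its YAML configuration.
SPECS = {
    "casper":   {"cores": 1,  "memory_gb": 25},
    "cheyenne": {"cores": 36, "memory_gb": 109},
    "hobart":   {"cores": 48, "memory_gb": 96},
    "izumi":    {"cores": 48, "memory_gb": 96},
}

def totals(cluster, jobs):
    """Total cores and memory (GB) for `jobs` batch jobs on `cluster`."""
    spec = SPECS[cluster]
    return spec["cores"] * jobs, spec["memory_gb"] * jobs

print(totals("cheyenne", 2))  # (72, 218), matching the repr above
```

This is why `scale(jobs=2)` on Cheyenne reports 72 cores and 218 GB, while the same call on Hobart or Izumi reports 96 cores and 192 GB.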

Non-NCAR machines

On non-NCAR machines, ncar-jobqueue warns the user and falls back to distributed.LocalCluster:

>>> from ncar_jobqueue import NCARCluster
.../ncar_jobqueue/cluster.py:42: UserWarning: Unable to determine which NCAR cluster you are running on... Returning a `distributed.LocalCluster` class.
warn(message)
>>> from dask.distributed import Client
>>> cluster = NCARCluster()
>>> cluster
NCARCluster('tcp://127.0.0.1:49334', workers=4, threads=8, memory=17.18 GB)
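The warning above suggests the package inspects the machine it is running on to decide which cluster configuration to load, falling back to a local cluster when nothing matches. A minimal sketch of how hostname-based detection might work; this is a hypothetical illustration, not the package's actual implementation, and the patterns below are assumptions:

```python
import re

# Hypothetical hostname patterns for the supported clusters; the real
# package's detection logic and patterns may differ.
_CLUSTER_PATTERNS = {
    "cheyenne": re.compile(r"cheyenne"),
    "casper": re.compile(r"casper"),
    "hobart": re.compile(r"hobart"),
    "izumi": re.compile(r"izumi"),
}

def detect_cluster(hostname):
    """Return the matching cluster name, or None for non-NCAR machines."""
    for name, pattern in _CLUSTER_PATTERNS.items():
        if pattern.search(hostname):
            return name
    return None  # caller would then warn and use distributed.LocalCluster
```

With such a scheme, a login node like `cheyenne3` maps to the Cheyenne settings, while an unrecognized hostname yields `None` and the LocalCluster fallback shown above.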
