
Project description

ncar-jobqueue

Badges

  • CI: GitHub Workflow Status, Code Coverage Status
  • Docs: Documentation Status
  • Package: Conda, PyPI
  • License

Utilities for expanding dask-jobqueue with appropriate settings for NCAR's clusters.

Supported clusters:

  • Cheyenne
  • Casper (DAV)
  • CGD's Hobart
  • CGD's Izumi

Installation

NCAR-jobqueue can be installed from PyPI with pip:

python -m pip install ncar-jobqueue

NCAR-jobqueue is also available from conda-forge for conda installations:

conda install -c conda-forge ncar-jobqueue
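
A quick way to confirm the installation is to import the package from the target environment (a minimal sanity check, assuming the environment in which you ran the command above is active):

python -c "import ncar_jobqueue"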

Usage

Casper

>>> from ncar_jobqueue import NCARCluster
>>> from dask.distributed import Client
>>> cluster = NCARCluster()
>>> cluster
NCARCluster(cores=0, memory=0 B, workers=0/0, jobs=0/0)
>>> cluster.scale(jobs=2)
>>> cluster
NCARCluster(cores=2, memory=50.00 GB, workers=2/2, jobs=2/2)
>>> client = Client(cluster)
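
Once the client is connected, the cluster behaves like any other Dask cluster. The snippet below is a minimal sketch (the array shape and chunk sizes are arbitrary) showing a computation run through the scaled cluster and the cleanup afterwards; the same pattern applies to the other machines shown below.

>>> import dask.array as da
>>> x = da.random.random((10000, 10000), chunks=(1000, 1000))
>>> result = x.mean().compute()
>>> client.close()
>>> cluster.close()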

Cheyenne

>>> from ncar_jobqueue import NCARCluster
>>> from dask.distributed import Client
>>> cluster = NCARCluster()
>>> cluster
NCARCluster(cores=0, memory=0 B, workers=0/0, jobs=0/0)
>>> cluster.scale(jobs=2)
>>> cluster
NCARCluster(cores=72, memory=218.00 GB, workers=2/2, jobs=2/2)
>>> client = Client(cluster)
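
Instead of scaling to a fixed number of jobs, the cluster can also scale adaptively, adding and removing workers based on load. A minimal sketch (the bounds are arbitrary and are expressed in workers, which Dask's generic adapt() accepts):

>>> cluster.adapt(minimum=2, maximum=8)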

Hobart

>>> from ncar_jobqueue import NCARCluster
>>> from dask.distributed import Client
>>> cluster = NCARCluster()
>>> cluster
NCARCluster(cores=0, memory=0 B, workers=0/0, jobs=0/0)
>>> cluster.scale(jobs=2)
>>> cluster
NCARCluster(cores=96, memory=192.00 GB, workers=2/2, jobs=2/2)
>>> client = Client(cluster)

Izumi

>>> from ncar_jobqueue import NCARCluster
>>> from dask.distributed import Client
>>> cluster = NCARCluster()
>>> cluster
NCARCluster(cores=0, memory=0 B, workers=0/0, jobs=0/0)
>>> cluster.scale(jobs=2)
>>> cluster
NCARCluster(cores=96, memory=192.00 GB, workers=2/2, jobs=2/2)
>>> client = Client(cluster)

Non-NCAR machines

On non-NCAR machines, ncar-jobqueue warns the user and falls back to distributed.LocalCluster:

>>> from ncar_jobqueue import NCARCluster
.../ncar_jobqueue/cluster.py:42: UserWarning: Unable to determine which NCAR cluster you are running on... Returning a `distributed.LocalCluster` class.
warn(message)
>>> from dask.distributed import Client
>>> cluster = NCARCluster()
>>> cluster
NCARCluster('tcp://127.0.0.1:49334', workers=4, threads=8, memory=17.18 GB)
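
The fallback cluster can be used with a client in the same way as the examples above; note that on a LocalCluster you scale by number of workers rather than jobs. A minimal sketch (the worker count is arbitrary):

>>> client = Client(cluster)
>>> cluster.scale(4)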

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ncar-jobqueue-2021.2.10.tar.gz (15.3 kB)

Built Distribution

ncar_jobqueue-2021.2.10-py3-none-any.whl (12.3 kB)

File details

Details for the file ncar-jobqueue-2021.2.10.tar.gz.

File metadata

  • Download URL: ncar-jobqueue-2021.2.10.tar.gz
  • Upload date:
  • Size: 15.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/53.0.0 requests-toolbelt/0.9.1 tqdm/4.56.2 CPython/3.8.7

File hashes

Hashes for ncar-jobqueue-2021.2.10.tar.gz

  • SHA256: dce3c0e2259de65143ac8d758ffbd8576c49b7627d5530b1e2a5925ef9e053b7
  • MD5: a8098671ed90d2ca49be275153b588da
  • BLAKE2b-256: 1b30dd7aaf463a5cc47a07a017d996035b0cacdec220e4a7866c5dffc20577f0

See more details on using hashes here.
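
For example, a downloaded archive can be checked against the SHA256 digest listed above before installation. A minimal sketch in Python (the file path is an assumption about where the download was saved):

import hashlib

# Path to the downloaded sdist (adjust to your download location)
path = "ncar-jobqueue-2021.2.10.tar.gz"

# SHA256 digest published above for this file
expected = "dce3c0e2259de65143ac8d758ffbd8576c49b7627d5530b1e2a5925ef9e053b7"

with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "hash mismatch")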

File details

Details for the file ncar_jobqueue-2021.2.10-py3-none-any.whl.

File metadata

  • Download URL: ncar_jobqueue-2021.2.10-py3-none-any.whl
  • Upload date:
  • Size: 12.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/53.0.0 requests-toolbelt/0.9.1 tqdm/4.56.2 CPython/3.8.7

File hashes

Hashes for ncar_jobqueue-2021.2.10-py3-none-any.whl

  • SHA256: 5680d0313cf84371ac139b086007338229668ad37ed2967ddb6f32c0aaec00d4
  • MD5: 4ae51594f5b19c8ad45f6ff44044f464
  • BLAKE2b-256: 4d2341cc1a0fad74673ce8976b588e51980a883807b5ccd29326ca222682f7e3

See more details on using hashes here.
