Package for efficiently parallelising zarr write operations based on awareness of source chunks

Reason this release was yanked:

Bug with cache directory name

Project description

Zarr Parallel Cacher

This package was developed as part of the NERC EDS FRAME-FM AI project and has been separated into its own module so it can be reused across multiple projects. AI-specific steps may form part of the package, but these can be disabled by default.

Basic Usage

from zarr_parallel.assembler import ZarrParallelAssembler

zp = ZarrParallelAssembler(
    data_uri=uri,
    preprocessors=preprocessors,
    chunks=chunks,
    engine='kerchunk',
    variables={'d2m': {}},
    cache_label='_v1')

zp.cache(
    cache_dir='/gws/ssde/j25b/eds_ai/frame-fm/data/zarr_cache',
    deploy_mode='dask_distributed',
    simultaneous_worker_limit=4)

The code snippet above demonstrates the use of this package. The data_uri and engine parameters are passed to xarray's open_dataset method to access the source object. chunks is required: it specifies the output chunking of the zarr cache and is also used to organise the parallel jobs. variables is optional and can specify transforms to run on individual data arrays (such as renaming), applied per array.
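For context, the parameters in the example might look like the following. The values, variable names and the exact shapes of chunks and variables here are illustrative assumptions based on the description above, not taken from the package:

```python
# Hypothetical parameter values for the example above.
# 'uri' points at the source object opened via xarray's open_dataset;
# with engine='kerchunk' this would typically be a reference file.
uri = 'https://example.org/data/era5_d2m.json'

# Output chunking for the zarr cache; this also drives how the
# parallel jobs are partitioned across the dataset.
chunks = {'time': 744, 'latitude': 181, 'longitude': 360}

# Preprocessing transforms applied to the whole dataset before caching.
preprocessors = []

# Per-variable options; an empty dict requests the variable unchanged.
variables = {'d2m': {}}

# Each output chunk holds time * latitude * longitude values.
values_per_chunk = 1
for size in chunks.values():
    values_per_chunk *= size
print(values_per_chunk)  # 744 * 181 * 360 = 48479040
```

With 4-byte values this puts each chunk at roughly 185 MB, which is the kind of figure the job planner has to weigh against the per-worker memory limit.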

The preprocessors list defines the preprocessing transforms (including selection) applied to the dataset at the point of caching. It should cover every transform that needs to happen before the data is written to the zarr cache.
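The README does not spell out the preprocessor interface; one common convention, assumed here rather than confirmed by the package, is an ordered list of dataset-to-dataset callables. The dict below stands in for an xarray.Dataset so the sketch is self-contained:

```python
# Sketch: preprocessors as an ordered list of dataset -> dataset callables,
# applied in sequence before the data is written to the zarr cache.

def select_region(ds):
    # On a real xarray.Dataset this might be ds.sel(latitude=slice(30, 60)).
    ds = dict(ds)
    ds['region'] = 'europe'
    return ds

def to_celsius(ds):
    # Unit conversion applied before caching.
    ds = dict(ds)
    ds['d2m'] = [v - 273.15 for v in ds['d2m']]
    return ds

preprocessors = [select_region, to_celsius]

ds = {'d2m': [273.15, 283.15]}
for fn in preprocessors:  # transforms run in list order
    ds = fn(ds)
print(ds['region'], [round(v, 2) for v in ds['d2m']])
```

Keeping the transforms in one list means every parallel worker applies an identical pipeline, so the cached chunks stay consistent with each other.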

The num_jobs and simultaneous_worker_limit parameters configure the parallel deployment. If num_jobs is not provided, the assembler calculates the optimal number of jobs for your memory limit (recommended). The default memory limit is 2GB and the timeout is 30 minutes, although the timeout currently applies only to SLURM deployments.
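The job-count calculation is not documented in detail; a plausible sketch, assumed here and not taken from the package internals, is to split the work into enough jobs that each job's share of the data fits comfortably inside the per-worker memory limit:

```python
import math

def estimate_num_jobs(total_bytes, memory_limit_bytes, safety_factor=0.5):
    """Split the dataset into enough jobs that each job's share of the
    data stays within a fraction of the per-worker memory limit."""
    usable = memory_limit_bytes * safety_factor
    return max(1, math.ceil(total_bytes / usable))

# Example: a 64 GB dataset against the default 2 GB memory limit,
# budgeting half the limit for data to leave headroom for overheads.
total_bytes = 64 * 2**30
memory_limit = 2 * 2**30
print(estimate_num_jobs(total_bytes, memory_limit))  # 64
```

With simultaneous_worker_limit=4 as in the example, those 64 jobs would then be drained four at a time.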

Transforms/Preprocessors

Transformations to the data may be specified via the preprocessors option passed in the example above. Xarray-native transformations are supported, as well as transforms from the FRAME-FM package if it is installed.
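As noted earlier, per-variable transforms in the variables mapping are applied individually. The sketch below uses a hypothetical 'rename' key to illustrate the idea; the key name and schema are assumptions, so consult the package for the real options:

```python
# Sketch: per-variable options, applied individually to each data array.
# The 'rename' key here is purely illustrative.
variables = {
    'd2m': {'rename': 'dewpoint_2m'},  # renamed on the way into the cache
    't2m': {},                         # cached unchanged
}

def output_name(name, options):
    """Resolve the name a variable gets in the zarr cache."""
    return options.get('rename', name)

out = {output_name(k, v): k for k, v in variables.items()}
print(out)  # {'dewpoint_2m': 'd2m', 't2m': 't2m'}
```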

Selection Recommendations

The assembler will halt to recommend alternative data selections based on the underlying chunk structure. Ignoring these recommendations is not advised: when selection borders do not match chunk borders, neighbouring workers may request the same chunks in duplicate, significantly increasing the memory required per worker.
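To see why misaligned selections are costly, count how many source chunks a worker's region overlaps along one dimension. This is an illustrative calculation, not the package's internal logic:

```python
def chunks_touched(start, stop, chunk_size):
    """Number of source chunks overlapped by the index range [start, stop)."""
    return (stop - 1) // chunk_size - start // chunk_size + 1

# Source chunks of 100 values along one dimension; each worker writes
# a region of 100 values.
aligned = chunks_touched(0, 100, 100)      # region matches chunk borders
misaligned = chunks_touched(50, 150, 100)  # region straddles a border
print(aligned, misaligned)  # 1 2
```

In the misaligned case every worker reads two chunks instead of one, and each boundary chunk is fetched by two neighbouring workers, so both the per-worker memory footprint and the total number of chunk requests roughly double along that dimension.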

Version 0.3 Changes

  • Heartbeats between jobs in the dask workers.
  • Dask distributed info messages can now be switched off.
  • Added the ability to add attributes.

