Package for efficiently parallelising zarr write operations based on awareness of source chunks

Project description

Zarr Parallel Cacher

This package has been developed as part of the NERC EDS FRAME-FM AI project. It has been separated into its own module for ease of reusability across multiple projects. AI-specific steps may form part of the package, but may also be disabled by default.

Basic Usage

from zarr_parallel.assembler import ZarrParallelAssembler

zp = ZarrParallelAssembler(
    data_uri=uri,
    preprocessors=preprocessors,
    chunks=chunks,
    engine='kerchunk',
    variables={'d2m': {}},
    cache_label='_v1')

zp.cache(
    cache_dir='/gws/ssde/j25b/eds_ai/frame-fm/data/zarr_cache',
    deploy_mode='dask_distributed',
    simultaneous_worker_limit=4)

The above code snippet demonstrates typical use of this package. The data_uri and engine parameters are passed through to xarray's open_dataset method to access the source object. chunks specifies the output chunking of the zarr cache, and is also used to organise the parallel jobs. variables is optional and allows transforms (such as renaming) to be applied to specific data arrays individually.

The preprocessors list defines the set of preprocessing transforms (including selection) applied to the dataset at the point of caching. It should include every transform that must be applied to the dataset before writing to the zarr cache.

The num_jobs and simultaneous_worker_limit parameters configure the parallel deployment. If num_jobs is not provided, the assembler calculates the optimal number of jobs for your memory limit (recommended). The default memory limit is 2 GB and the timeout is 30 minutes, although the timeout currently applies only to SLURM deployments.
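As a rough illustration of how a job count can follow from a memory limit: if each job must hold its share of the data within the per-worker budget, the minimum number of jobs is the total size divided by that budget, rounded up. The function below is a hypothetical sketch, not the assembler's actual calculation.

```python
import math

# Hypothetical sketch: estimate how many parallel jobs fit a memory budget.
# None of these names come from zarr_parallel; the real calculation may differ.

def estimate_num_jobs(total_bytes: int, memory_limit_bytes: int) -> int:
    """Split the dataset into jobs small enough to fit the per-worker limit."""
    return max(1, math.ceil(total_bytes / memory_limit_bytes))

# A 50 GB source with the default 2 GB limit needs at least 25 jobs.
total = 50 * 1024**3
limit = 2 * 1024**3
print(estimate_num_jobs(total, limit))  # → 25
```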

Transforms/Preprocessors

Transformations to the data may be specified via the preprocessors option passed in the above example. Xarray-native transformations are supported, as are transforms from the FRAME-FM package if it is installed.
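A minimal sketch of what a preprocessors list might look like, assuming each preprocessor is a callable that takes a dataset and returns a transformed one. In real use these would operate on an xarray.Dataset; a plain dict of arrays stands in here, and all names are hypothetical.

```python
# Hypothetical preprocessors: callables applied in order before caching.
# Real preprocessors would receive an xarray.Dataset; a dict stands in here.

def select_first_two(ds):
    """Selection step: keep only the first two entries along time."""
    return {name: values[:2] for name, values in ds.items()}

def rename_d2m(ds):
    """Rename step: 'd2m' -> 'dewpoint_2m'."""
    return {('dewpoint_2m' if name == 'd2m' else name): values
            for name, values in ds.items()}

preprocessors = [select_first_two, rename_d2m]

# Apply the chain, as the assembler might at the point of caching.
dataset = {'d2m': [280.1, 281.3, 279.8], 't2m': [285.0, 286.2, 284.9]}
for fn in preprocessors:
    dataset = fn(dataset)

print(sorted(dataset))          # ['dewpoint_2m', 't2m']
print(dataset['dewpoint_2m'])   # [280.1, 281.3]
```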

Selection Recommendations

The assembler will halt to recommend alternative data selections based on the underlying chunk structure. Proceeding without following these recommendations is not advised: mismatched chunk-region borders may require duplicating chunk requests, significantly increasing the memory requirements per worker.
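To see why misaligned borders cost memory, consider a one-dimensional axis with fixed-size source chunks: any region border that falls inside a chunk forces that chunk to be fetched by more than one worker. A hedged sketch of the arithmetic (all names hypothetical, not part of this package's API):

```python
import math

def chunks_touched(start, stop, chunk_size):
    """Indices of source chunks overlapped by the half-open range [start, stop)."""
    return range(start // chunk_size, math.ceil(stop / chunk_size))

chunk_size = 100  # source chunk length along this axis

# Aligned regions: borders at multiples of 100, so no chunk is shared.
aligned = [(0, 100), (100, 200)]
# Misaligned regions: the border at 150 splits chunk 1 between both workers.
misaligned = [(0, 150), (150, 300)]

for label, regions in [('aligned', aligned), ('misaligned', misaligned)]:
    reads = [set(chunks_touched(a, b, chunk_size)) for a, b in regions]
    shared = set.intersection(*reads)
    print(label, 'duplicated chunks:', sorted(shared))
# aligned duplicated chunks: []
# misaligned duplicated chunks: [1]
```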

Version 0.3 Changes

  • Heartbeats between jobs in the dask workers.
  • Dask distributed info messages can now be switched off.
  • Added the ability to add attributes.


Download files


Source Distribution

zarr_parallel-0.3.2.tar.gz (14.5 kB)

Uploaded Source

Built Distribution


zarr_parallel-0.3.2-py3-none-any.whl (18.2 kB)

Uploaded Python 3

File details

Details for the file zarr_parallel-0.3.2.tar.gz.

File metadata

  • Download URL: zarr_parallel-0.3.2.tar.gz
  • Upload date:
  • Size: 14.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.3.2 CPython/3.12.2 Linux/5.14.0-611.27.1.el9_7.x86_64

File hashes

Hashes for zarr_parallel-0.3.2.tar.gz
Algorithm Hash digest
SHA256 dbff0c77ca3ac18bbc58aecf9dae9eca83d2aa91f6aab3366ef08266a2de1a76
MD5 19d953f1f3211321e493ac3cd83abc38
BLAKE2b-256 24f936bd7390fc200f9c368929796106410a29587ff67e24cb0e4ac3a488241b


File details

Details for the file zarr_parallel-0.3.2-py3-none-any.whl.

File metadata

  • Download URL: zarr_parallel-0.3.2-py3-none-any.whl
  • Upload date:
  • Size: 18.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.3.2 CPython/3.12.2 Linux/5.14.0-611.27.1.el9_7.x86_64

File hashes

Hashes for zarr_parallel-0.3.2-py3-none-any.whl
Algorithm Hash digest
SHA256 40f780d00a30f23aa4d5fbd81b5b74c7e2f27f25b37b0e76f596bd799babc184
MD5 64dfc3fce68392b290b0adfb531baa8c
BLAKE2b-256 35fa0459e9efa86c1325a53dfbec22133dc70093e9bfcd46a5b6d078aed6d0aa

