
Concurrent HDF5 and NetCDF4 reader (experimental)

Project description


HIDEFIX

This library provides an alternative reader for HDF5 and NetCDF4 files (which use HDF5 underneath) that supports concurrent access to data. This is achieved by building an index of the chunks, allowing a thread to use many file handles to read the file. The original (native) HDF5 library is used to build the index, but once the index has been created it is no longer needed. The index can be serialized to disk so that the indexing does not have to be repeated.

use hidefix::prelude::*;

let idx = Index::index("tests/data/coads_climatology.nc4").unwrap();
let mut r = idx.reader("SST").unwrap();

let values = r.values::<f32>(None, None).unwrap();

println!("SST: {:?}", values);

Motivation

The HDF5 library requires internal locks to be thread-safe, since it relies on internal buffers which cannot be safely accessed or written from multiple threads. This effectively forces multi-threaded applications into sequential reads while competing for the locks, and threads apparently also cause each other trouble, perhaps by evicting cached chunks that other threads still need. HDF5 can be used safely from different processes, but that potentially involves much more overhead than multi-threaded or asynchronous code.

Some basic benchmarks

hidefix is intended to perform better when concurrent reads are made to the same dataset, the same file, or different files from a single process. In basic benchmarks of plain sequential reads, its performance is on par with or slightly better than the native HDF5 library (through its Rust bindings). Where hidefix shines is when multiple threads in the same process read from an HDF5 file simultaneously, in any combination.

This simple benchmark tries to read a small dataset sequentially or concurrently using the cached reader from hidefix and the native reader from HDF5. The dataset is chunked, shuffled and compressed (using gzip):

$ cargo bench --bench concurrency -- --ignored

test shuffled_compressed::cache_concurrent_reads  ... bench:  15,903,406 ns/iter (+/- 220,824)
test shuffled_compressed::cache_sequential        ... bench:  59,778,761 ns/iter (+/- 602,316)
test shuffled_compressed::native_concurrent_reads ... bench: 411,605,868 ns/iter (+/- 35,346,233)
test shuffled_compressed::native_sequential       ... bench: 103,457,237 ns/iter (+/- 7,703,936)
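
The concurrent scenario above can be reproduced in miniature by letting several threads read the same dataset through a shared index. A rough sketch, assuming the index can be shared across threads (method names mirror the example above; details may differ in the actual API):

use hidefix::prelude::*;
use std::sync::Arc;

let idx = Arc::new(Index::index("tests/data/coads_climatology.nc4").unwrap());

let handles: Vec<_> = (0..4)
    .map(|_| {
        let idx = idx.clone();
        std::thread::spawn(move || {
            // Each thread creates its own reader, and thus its own file handles.
            let mut r = idx.reader("SST").unwrap();
            r.values::<f32>(None, None).unwrap().len()
        })
    })
    .collect();

for h in handles {
    println!("read {} values", h.join().unwrap());
}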

Inspiration and other projects

This work is based in part on the DMR++ module of the OPeNDAP Hyrax server. The zarr format does something similar, and the same approach has been tested out on HDF5 as well.

