Transparent optimized reading of n-dimensional Blosc2 slices for h5py

Project description

b2h5py provides h5py with transparent, automatic optimized reading of n-dimensional slices of Blosc2-compressed datasets. This optimized slicing leverages direct chunk access (skipping the slow HDF5 filter pipeline) and 2-level partitioning into chunks and then smaller blocks (so that less data is actually decompressed).

Benchmarks of this technique show 2x-5x speed-ups compared with normal filter-based access. Comparable results are obtained with a similar technique in PyTables, see Optimized Hyper-slicing in PyTables with Blosc2 NDim.

(Benchmark figure: doc/benchmark.png)

Usage

This optimized access works for slices with step 1 on Blosc2-compressed datasets using the native byte order. It is enabled by monkey-patching the h5py.Dataset class to extend the slicing operation. The easiest way to do this is:

import b2h5py.auto

After that, optimization will be attempted for any slicing of a dataset (of the form dataset[...] or dataset.__getitem__(...)). If the optimization is not possible in a particular case, normal h5py slicing code will be used (which performs HDF5 filter-based access, backed by hdf5plugin to support Blosc2).

You may instead just import b2h5py and explicitly enable the optimization globally by calling b2h5py.enable_fast_slicing(), and disable it again with b2h5py.disable_fast_slicing(). You may also enable it temporarily by using a context manager:

with b2h5py.fast_slicing():
    # ... code that will use Blosc2 optimized slicing ...

Finally, you may explicitly enable optimizations for a given h5py dataset by wrapping it in a B2Dataset instance:

b2dset = b2h5py.B2Dataset(dset)
# ... slicing ``b2dset`` will use Blosc2 optimization ...

Building

Just install PyPA build (e.g. pip install build), enter the source code directory and run pyproject-build to get a source tarball and a wheel under the dist directory.
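In a shell, the steps above might look like this (the source directory name is illustrative):

```shell
pip install build   # install the PyPA build frontend
cd b2h5py           # enter the source code directory
pyproject-build     # writes a source tarball and a wheel under dist/
```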

Installing

To install as a wheel from PyPI, run pip install b2h5py.

You may also install the wheel that you built in the previous section, or enter the source code directory and run pip install . from there.

Running tests

If you have installed b2h5py, just run python -m unittest discover b2h5py.tests.

Otherwise, enter its source code directory and run python -m unittest.

You can also run the h5py tests with the patched Dataset class to check that patching does not break anything. Install the h5py-test extra (e.g. pip install b2h5py[h5py-test]) and run python -m b2h5py.tests.test_patched_h5py.

Project details


Download files

Download the file for your platform.

Source Distribution

b2h5py-0.5.1.tar.gz (15.2 kB)

Uploaded Source

Built Distribution


b2h5py-0.5.1-py3-none-any.whl (16.6 kB)

Uploaded Python 3

File details

Details for the file b2h5py-0.5.1.tar.gz.

File metadata

  • Download URL: b2h5py-0.5.1.tar.gz
  • Upload date:
  • Size: 15.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.1

File hashes

Hashes for b2h5py-0.5.1.tar.gz:

  • SHA256: 1426d4f5b99e9bd032a75a9f989cfadc1e9e62238cd967bb5fb68ef9f4cc9797
  • MD5: 7799baccab7ab0f57730a9c83098f5bf
  • BLAKE2b-256: f67e5a6b1d4d4d8110760c4591be90e492f9cecee8ab8b0f503379f8f45c48a5


File details

Details for the file b2h5py-0.5.1-py3-none-any.whl.

File metadata

  • Download URL: b2h5py-0.5.1-py3-none-any.whl
  • Upload date:
  • Size: 16.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.1

File hashes

Hashes for b2h5py-0.5.1-py3-none-any.whl:

  • SHA256: 8f0db2b733cf1d8f0b9fd778179c115fce3a5a5857cec1fb832bf492f72b6d5e
  • MD5: 9d43120c939d4c43b750d731381338e6
  • BLAKE2b-256: 4723ec22a4860d90ccd82ed33df231a04e93734de58dfba67a16d0710f9a4638

