
HDF5 Plugins for Windows, macOS and Linux

Project description

This module provides HDF5 compression filters (namely: blosc, bitshuffle, lz4, FCIDECOMP, ZFP) and registers them with the HDF5 library used by h5py.

  • Supported operating systems: Linux, Windows, macOS.
  • Supported versions of Python: 2.7 and >= 3.4

hdf5plugin, installable via pip or conda, provides a generic way to enable the use of the provided HDF5 compression filters with h5py.

Alternative ways to install HDF5 compression filters include a system-wide installation on Linux, or other conda packages such as blosc-hdf5-plugin and hdf5-lz4.

The HDF5 plugin sources were obtained from their respective upstream projects.

Installation

To install, run:

pip install hdf5plugin [--user]

or, with conda (https://anaconda.org/conda-forge/hdf5plugin):

conda install -c conda-forge hdf5plugin

To install from source and recompile the HDF5 plugins, run:

pip install hdf5plugin --no-binary hdf5plugin [--user]

Installing from source can yield better performance by enabling AVX2 and OpenMP support when available.

Documentation

To use it, simply import hdf5plugin; the supported compression filters are then available from h5py.

Sample code:

import numpy
import h5py
import hdf5plugin

# Compression
f = h5py.File('test.h5', 'w')
f.create_dataset('data', data=numpy.arange(100), **hdf5plugin.LZ4())
f.close()

# Decompression
f = h5py.File('test.h5', 'r')
data = f['data'][()]
f.close()

hdf5plugin provides:

Bitshuffle(nelems=0, lz4=True)

This class takes the following arguments and returns the compression options to feed into h5py.Group.create_dataset for using the bitshuffle filter:

  • nelems: the number of elements per block; must be divisible by eight (default is 0, about 8 kB per block)
  • lz4: if True, the elements are compressed using lz4 after shuffling (default is True)

These arguments can be passed as keyword arguments.

Sample code:

f = h5py.File('test.h5', 'w')
f.create_dataset('bitshuffle_with_lz4', data=numpy.arange(100),
      **hdf5plugin.Bitshuffle(nelems=0, lz4=True))
f.close()

Blosc(cname='lz4', clevel=5, shuffle=SHUFFLE)

This class takes the following arguments and returns the compression options to feed into h5py.Group.create_dataset for using the blosc filter:

  • cname the compression algorithm, one of:
    • 'blosclz'
    • 'lz4' (default)
    • 'lz4hc'
    • 'snappy' (optional, requires C++11)
    • 'zlib'
    • 'zstd'
  • clevel the compression level, from 0 to 9 (default is 5)
  • shuffle the shuffling mode, in:
    • Blosc.NOSHUFFLE (0): No shuffle
    • Blosc.SHUFFLE (1): byte-wise shuffle (default)
    • Blosc.BITSHUFFLE (2): bit-wise shuffle

These arguments can be passed as keyword arguments.

Sample code:

f = h5py.File('test.h5', 'w')
f.create_dataset('blosc_byte_shuffle_blosclz', data=numpy.arange(100),
    **hdf5plugin.Blosc(cname='blosclz', clevel=9, shuffle=hdf5plugin.Blosc.SHUFFLE))
f.close()

FciDecomp()

This class takes no argument and returns the compression options to feed into h5py.Group.create_dataset for using the FciDecomp filter.

Sample code:

f = h5py.File('test.h5', 'w')
f.create_dataset('fcidecomp', data=numpy.arange(100),
    **hdf5plugin.FciDecomp())
f.close()

LZ4(nbytes=0)

This class takes the number of bytes per block as argument and returns the compression options to feed into h5py.Group.create_dataset for using the lz4 filter:

  • nbytes: the number of bytes per block; must be in the range 0 < nbytes < 2113929216 (about 1.9 GB). The default is 0, which corresponds to 1 GB per block.

It can be passed as a keyword argument.

Sample code:

f = h5py.File('test.h5', 'w')
f.create_dataset('lz4', data=numpy.arange(100),
    **hdf5plugin.LZ4(nbytes=0))
f.close()

Zfp()

This class returns the compression options to feed into h5py.Group.create_dataset for using the zfp filter.

The compression mode and its parameters can be passed as keyword arguments.

Sample code:

f = h5py.File('test.h5', 'w')
f.create_dataset('zfp', data=numpy.random.random(100),
    **hdf5plugin.Zfp())
f.close()

The zfp filter compression mode is defined by the provided arguments. The following compression modes are supported:

  • Fixed-rate mode: For details, see zfp fixed-rate mode.

    f.create_dataset('zfp_fixed_rate', data=numpy.random.random(100),
        **hdf5plugin.Zfp(rate=10.0))
    
  • Fixed-precision mode: For details, see zfp fixed-precision mode.

    f.create_dataset('zfp_fixed_precision', data=numpy.random.random(100),
        **hdf5plugin.Zfp(precision=10))
    
  • Fixed-accuracy mode: For details, see zfp fixed-accuracy mode.

    f.create_dataset('zfp_fixed_accuracy', data=numpy.random.random(100),
        **hdf5plugin.Zfp(accuracy=0.001))
    
  • Reversible (i.e., lossless) mode: For details, see zfp reversible mode.

    f.create_dataset('zfp_reversible', data=numpy.random.random(100),
        **hdf5plugin.Zfp(reversible=True))
    
  • Expert mode: For details, see zfp expert mode.

    f.create_dataset('zfp_expert', data=numpy.random.random(100),
        **hdf5plugin.Zfp(minbits=1, maxbits=16657, maxprec=64, minexp=-1074))
    

Dependencies

Testing

To run self-contained tests, from Python:

import hdf5plugin.test
hdf5plugin.test.run_tests()

Or, from the command line:

python -m hdf5plugin.test

To also run tests relying on actual HDF5 files, run from the source directory:

python test/test.py

This tests the installed version of hdf5plugin.

License

The source code of hdf5plugin itself is licensed under the MIT license (see the LICENSE file). Use it at your own risk.

The source code of the embedded HDF5 filter plugin libraries is licensed under different open-source licenses. Please read the corresponding licenses.

The HDF5 v1.10.5 headers (and Windows .lib file) used to build the filters are stored for convenience in the repository. The license is available here: src/hdf5/COPYING.


Files for hdf5plugin, version 2.3.2:

  • hdf5plugin-2.3.2-py2-none-win_amd64.whl (437.7 kB): Wheel, py2
  • hdf5plugin-2.3.2-py2.py3-none-macosx_10_9_x86_64.whl (829.1 kB): Wheel, py2.py3
  • hdf5plugin-2.3.2-py2.py3-none-manylinux1_x86_64.whl (2.8 MB): Wheel, py2.py3
  • hdf5plugin-2.3.2-py2.py3-none-manylinux2014_ppc64le.whl (5.3 MB): Wheel, py2.py3
  • hdf5plugin-2.3.2-py2.py3-none-manylinux2014_x86_64.whl (5.4 MB): Wheel, py2.py3
  • hdf5plugin-2.3.2-py3-none-win_amd64.whl (476.4 kB): Wheel, py3
  • hdf5plugin-2.3.2.tar.gz (11.1 MB): Source
