netCDF4 via h5py

A Python interface for the netCDF4 file-format that reads and writes local or remote HDF5 files directly via h5py or h5pyd, without relying on the Unidata netCDF library.

Why h5netcdf?

  • It has one fewer binary dependency (the netCDF C library). If you already have h5py installed, reading netCDF4 with h5netcdf may be much easier than installing netCDF4-Python.

  • We’ve seen occasional reports of better performance with h5py than netCDF4-python, though in many cases performance is identical. For one workflow, h5netcdf was reported to be almost 4x faster than netCDF4-python.

  • Anecdotally, HDF5 users seem to be unexcited about switching to netCDF – hopefully this will convince them that netCDF4 is actually quite sane!

  • Finally, side-stepping the netCDF C library (and Cython bindings to it) gives us an easier way to identify the source of performance issues and bugs in the netCDF libraries/specification.

Install

Ensure you have a recent version of h5py installed (I recommend using conda or the community effort conda-forge). At least version 2.1 is required (for dimension scales); versions 2.3 and newer have been verified to work, though some tests only pass on h5py 2.6. Then:

$ pip install h5netcdf

Or if you are already using conda:

$ conda install h5netcdf

Usage

h5netcdf has two APIs, a new API and a legacy API. Both interfaces currently reproduce most of the features of the netCDF interface, with the notable exception of support for operations that rename or delete existing objects. We simply haven’t gotten around to implementing this yet. Patches would be very welcome.

New API

The new API supports direct hierarchical access of variables and groups. Its design is an adaptation of h5py to the netCDF data model. For example:

import h5netcdf
import numpy as np

with h5netcdf.File('mydata.nc', 'w') as f:
    # set dimensions with a dictionary
    f.dimensions = {'x': 5}
    # and update them with a dict-like interface
    # f.dimensions['x'] = 5
    # f.dimensions.update({'x': 5})

    v = f.create_variable('hello', ('x',), float)
    v[:] = np.ones(5)

    # you don't need to create groups first
    # you also don't need to create dimensions first if you supply data
    # with the new variable
    v = f.create_variable('/grouped/data', ('y',), data=np.arange(10))

    # access and modify attributes with a dict-like interface
    v.attrs['foo'] = 'bar'

    # you can access variables and groups directly using hierarchical
    # keys, like h5py
    print(f['/grouped/data'])

    # add an unlimited dimension
    f.dimensions['z'] = None
    # explicitly resize a dimension and all variables using it
    f.resize_dimension('z', 3)

Notes:

  • Automatic resizing of unlimited dimensions with array indexing is not available.

  • Dimensions need to be manually resized with Group.resize_dimension(dimension, size).

  • Arrays are returned padded with the fill value (taken from the underlying HDF5 dataset) up to the current size of the variable’s dimensions. The behaviour is equivalent to netCDF4-python’s Dataset.set_auto_mask(False).
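The notes above can be sketched end-to-end with the new API; a minimal example (the file name unlimited.nc is illustrative):

```python
import numpy as np
import h5netcdf

# File name is illustrative.
with h5netcdf.File('unlimited.nc', 'w') as f:
    f.dimensions['z'] = None                       # unlimited dimension
    v = f.create_variable('values', ('z',), float)
    f.resize_dimension('z', 3)                     # resizing is always explicit
    v[:2] = [1.0, 2.0]                             # last element keeps the fill value

with h5netcdf.File('unlimited.nc', 'r') as f:
    data = f['values'][:]                          # padded up to the current size
```

Writing `v[:2]` does not grow the dimension; only the explicit resize_dimension call does.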

Legacy API

The legacy API is designed for compatibility with netCDF4-python. To use it, import h5netcdf.legacyapi:

import h5netcdf.legacyapi as netCDF4
# everything here would also work with this instead:
# import netCDF4
import numpy as np

with netCDF4.Dataset('mydata.nc', 'w') as ds:
    ds.createDimension('x', 5)
    v = ds.createVariable('hello', float, ('x',))
    v[:] = np.ones(5)

    g = ds.createGroup('grouped')
    g.createDimension('y', 10)
    g.createVariable('data', 'i8', ('y',))
    v = g['data']
    v[:] = np.arange(10)
    v.foo = 'bar'
    print(ds.groups['grouped'].variables['data'])

The legacy API is designed to be easy to try-out for netCDF4-python users, but it is not an exact match. Here is an incomplete list of functionality we don’t include:

  • Utility functions chartostring, num2date, etc., that are not directly necessary for writing netCDF files.

  • h5netcdf variables do not support automatic masking or scaling (e.g., of values matching the _FillValue attribute). We prefer to leave this functionality to client libraries (e.g., xarray), which can implement their exact desired scaling behavior. Nevertheless, arrays are returned padded with the fill value (taken from the underlying HDF5 dataset) up to the current size of the variable’s dimensions. The behaviour is equivalent to netCDF4-python’s Dataset.set_auto_mask(False).
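Client code that does want masking can apply the _FillValue itself; a minimal sketch with the legacy API (the file name, variable name, and fill value are all illustrative):

```python
import numpy as np
import h5netcdf.legacyapi as netCDF4

# File name and fill value are illustrative.
with netCDF4.Dataset('masked.nc', 'w') as ds:
    ds.createDimension('x', 3)
    v = ds.createVariable('t', 'f8', ('x',), fill_value=-999.0)
    v[:] = [1.0, -999.0, 3.0]

with netCDF4.Dataset('masked.nc', 'r') as ds:
    raw = ds['t'][:]                                  # fill values come back unmasked
    masked = np.ma.masked_equal(raw, ds['t']._FillValue)
```

This mirrors what netCDF4-python's auto-masking does internally, but keeps the decision in the caller's hands.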

Invalid netCDF files

h5py implements some features that do not (yet) result in valid netCDF files:

  • Data types:
    • Booleans

    • Complex values

    • Non-string variable length types

    • Enum types

    • Reference types

  • Arbitrary filters:
    • Scale-offset filters

By default [1], h5netcdf will not allow writing files using any of these features, as files with such features are not readable by other netCDF tools.

However, these are still valid HDF5 files. If you don’t care about netCDF compatibility, you can use these features by setting invalid_netcdf=True when creating a file:

# avoid the .nc extension for non-netcdf files
f = h5netcdf.File('mydata.h5', invalid_netcdf=True)
...

# works with the legacy API, too, though compression options are not exposed
ds = h5netcdf.legacyapi.Dataset('mydata.h5', invalid_netcdf=True)
...

In such cases, the _NCProperties attribute will not be saved to the file, or it will be removed from an existing file. A warning is issued if the file has a .nc extension.
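As an illustration, complex values (one of the otherwise-rejected features) round-trip once invalid_netcdf=True is set; a minimal sketch with an illustrative file name, using the .h5 extension as recommended above:

```python
import numpy as np
import h5netcdf

# Complex values are valid HDF5 but not valid netCDF, so this write
# requires invalid_netcdf=True (file name is illustrative).
with h5netcdf.File('complex.h5', 'w', invalid_netcdf=True) as f:
    f.dimensions['x'] = 2
    f.create_variable('data', ('x',), data=np.array([1 + 2j, 3 - 4j]))

with h5netcdf.File('complex.h5', 'r') as f:
    values = f['data'][:]
```

Without the flag, the write raises an error rather than silently producing a file other netCDF tools cannot read.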

Decoding variable length strings

h5py 3.0 introduced new behavior for handling variable length strings. Instead of being automatically decoded with UTF-8 into NumPy arrays of str, they are returned as arrays of bytes.

The legacy API preserves the old behavior of h5py (which matches netCDF4), and automatically decodes strings.

The new API matches h5py’s behavior. Explicitly set decode_vlen_strings=True in the h5netcdf.File constructor to opt in to automatic decoding.

Datasets with missing dimension scales

By default [2], h5netcdf raises a ValueError when accessing a variable that has no dimension scale associated with one of its axes. You can set phony_dims='sort' when opening a file to let h5netcdf invent phony dimensions according to netCDF behaviour.

# mimic netCDF-behaviour for non-netcdf files
f = h5netcdf.File('mydata.h5', mode='r', phony_dims='sort')
...

Note that this iterates once over the whole group hierarchy, which affects performance if you rely on lazy group access. You can set phony_dims='access' instead to defer phony dimension creation until a group is accessed. The resulting phony dimension names will differ from netCDF behaviour.

f = h5netcdf.File('mydata.h5', mode='r', phony_dims='access')
...

Track Order

In h5netcdf version 0.12.0 and earlier, order tracking was disabled in the HDF5 file. As order tracking is a requirement of the current netCDF4 standard, it was enabled without deprecation as of version 0.13.0 [*].

However, in version 0.13.1 this change was reverted due to an upstream bug in h5py, a core dependency of h5netcdf.

Datasets created with h5netcdf version 0.12.0 that are opened with newer versions of h5netcdf will continue to have order tracking disabled.

Changelog

License

3-clause BSD

