
High performance parallel reading of HDF5 files using PyTables, multiprocessing, and shared memory.


multitables is a Python library designed for high-speed access to HDF5 files. Access to HDF5 is provided by the PyTables library (tables). Multiple processes are launched to read an HDF5 file in parallel, allowing concurrent decompression. Data is streamed back to the invoker through shared memory, removing the usual multiprocessing communication overhead.
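The shared-memory handoff described above can be illustrated with the standard library alone. This is not multitables' implementation, just a minimal sketch of the idea: a worker process writes its results directly into a shared block, so no bytes are pickled back through a pipe or queue.

```python
import multiprocessing as mp
from multiprocessing import shared_memory

def fill_rows(shm_name, n):
    # Worker process: write results straight into the shared block,
    # so nothing is serialised back to the parent.
    shm = shared_memory.SharedMemory(name=shm_name)
    for i in range(n):
        shm.buf[i] = i * 2
    shm.close()

def shared_memory_demo(n=8):
    shm = shared_memory.SharedMemory(create=True, size=n)
    ctx = mp.get_context("fork")  # fork keeps the demo simple on POSIX
    p = ctx.Process(target=fill_rows, args=(shm.name, n))
    p.start()
    p.join()
    result = list(shm.buf[:n])  # read the worker's output in place
    shm.close()
    shm.unlink()
    return result

if __name__ == "__main__":
    print(shared_memory_demo())
```

multitables manages buffers like this for you; the sketch only shows why the handoff avoids the usual inter-process copy.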

The data is organised as rows of an array (elements of the outer-most dimension); groups of these rows form blocks. With the Streamer interface, there is no guarantee on the ordering of the rows and/or blocks returned to the user, due to the concurrent nature of the library: they are returned as they become available. On-disk ordering can be forced using the ordered option, which may incur a performance penalty.
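The row/block layout can be pictured with plain Python lists; the block size here is purely illustrative, not a multitables default:

```python
# A toy 6-row array; rows are elements of the outer-most dimension.
array = [[i, i + 1] for i in range(6)]

# Blocks group consecutive rows of the array.
block_size = 2
blocks = [array[i:i + block_size] for i in range(0, len(array), block_size)]

print(len(blocks))  # 3
print(blocks[0])    # [[0, 1], [1, 2]]
```

With an unordered stream, whole blocks like these may arrive in any order.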

The Reader interface allows random access, which gives fine grained control over the ordering of requests.

Performance gains of at least 2x can be achieved when reading from an SSD.

New with version 2

Random access reads are now possible through asynchronous requests. The results of these requests are placed in shared memory. See the how-to and unit tests for examples of the new interface.

Licence

This software is distributed under the MIT licence. See the LICENSE.txt file for details.

Installation

pip install multitables

multitables depends on tables (the PyTables package), numpy, msgpack, and wrapt. The package is compatible with current versions of Python 3; PyTables no longer supports Python 2.

Quick start: Streaming

import multitables
stream = multitables.Streamer(filename='/path/to/h5/file')
for row in stream.get_generator(path='/internal/h5/path'):
    do_something(row)

Quick start: Random access

import numpy as np
import multitables

reader = multitables.Reader(filename='/path/to/h5/file')

dataset = reader.get_dataset(path='/internal/h5/path')
stage = dataset.create_stage(10)  # Size of the shared
                                  # memory stage in rows

req = dataset['col_A'][30:35]  # Create a request as you
                               # would index normally.

future = reader.request(req, stage)  # Schedule the request
with future.get_unsafe() as data:
    do_something(data)
data = None  # Always set data to None after get_unsafe to
             # prevent a dangling reference

# ... or use a safer proxy method

req = dataset.col('col_A')[30:35, ..., :100]

future = reader.request(req, stage)
with future.get_proxy() as data:
    do_something(data)

# ... or get a copy of the data

req = dataset['col_A'][30:35, np.arange(500) > 45]

future = reader.request(req, stage)
do_something(future.get())
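The stage/request/future flow above can be sketched with the standard library. This is a stand-in, not the real multitables API: a preallocated buffer plays the role of the shared-memory stage, and a thread pool stands in for the reader processes.

```python
import concurrent.futures

class Stage:
    # A fixed-size, reusable buffer playing the role of the
    # shared-memory stage (hypothetical stand-in, not the real API).
    def __init__(self, size):
        self.buf = bytearray(size)

def read_rows(dataset, sl, stage):
    # "Read" the requested slice into the stage, then hand back a
    # zero-copy view of just the bytes that were filled.
    data = dataset[sl]
    stage.buf[:len(data)] = data
    return memoryview(stage.buf)[:len(data)]

dataset = bytes(range(100))   # stand-in for an on-disk dataset
stage = Stage(10)             # room for 10 one-byte "rows"

with concurrent.futures.ThreadPoolExecutor() as pool:
    future = pool.submit(read_rows, dataset, slice(30, 35), stage)
    view = future.result()
    result = bytes(view)      # copy out before the stage is reused
    view.release()

print(list(result))  # [30, 31, 32, 33, 34]
```

This mirrors why get() copies while get_unsafe() and get_proxy() do not: the stage is reused across requests, so a raw view is only valid until the next request lands in it.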

Examples

See the how-to for more in-depth documentation, and the unit tests for complete examples.

Documentation

Online documentation is available. A how-to gives a basic overview of the library. A benchmark tests the speed of the library under various compression algorithms and hardware configurations.

Offline documentation can be built from the docs folder using Sphinx.
