indexed_bzip2


This module provides an IndexedBzip2File class, which can be used to seek inside bzip2 files without having to decompress them first. Alternatively, you can use this simply as a parallelized bzip2 decoder as a replacement for Python's builtin bz2 module in order to fully utilize all your cores.

On a 12-core processor, this can lead to a speedup of 6 over Python's bz2 module, see this example. Note that without parallelization, indexed_bzip2 is unfortunately slower than Python's bz2 module. Therefore, it is not recommended when neither seeking nor parallelization is used!

The internals are based on an improved version of the bzip2 decoder bzcat from toybox, which was refactored and extended to export and import bzip2 block offsets, to seek to those offsets, and to support threaded parallel decoding of blocks.

Seeking inside a block is only emulated, so IndexedBzip2File will only speed up seeking when there is more than one block, which should almost always be the case for archives larger than 1 MB because a single bzip2 block holds at most 900 kB of uncompressed data.
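
For illustration, the number of blocks can be counted via the block offset map described in the Usage section below; this is only a sketch because building that map requires decoding the whole file once:

import os
import indexed_bzip2 as ibz2

# The block offset map has one entry per bzip2 block, keyed by the block's bit
# offset in the compressed stream, so its length equals the number of blocks.
# Creating it decodes the whole archive, so this only makes sense when the
# index is going to be reused anyway.
with ibz2.open( "example.bz2", parallelization = os.cpu_count() ) as file:
    print( "Number of bzip2 blocks:", len( file.block_offsets() ) )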

Since version 1.2.0, parallel decoding of blocks is supported! However, by default, the older serial implementation is used. To use the parallel implementation, pass a parallelization argument other than 1 to IndexedBzip2File, as in the sketch below.
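
For example, a minimal parallel setup might look like this (the full examples follow in the Usage section):

import os
import indexed_bzip2

# Any parallelization value other than 1 selects the parallel decoder;
# parallelization = 0 would pick the number of available cores automatically.
with indexed_bzip2.IndexedBzip2File( "example.bz2", parallelization = os.cpu_count() ) as file:
    data = file.read( 100 )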

Table of Contents

  1. Installation
  2. Usage
    1. Python Library
    2. Via Ratarmount
    3. Command Line Tool
    4. C++ Library
  3. Performance comparison with bz2 module
  4. Internal Architecture
  5. Tracing the Decoder

Installation

You can simply install it from PyPI:

python3 -m pip install --upgrade pip  # Recommended for newer manylinux wheels
python3 -m pip install indexed_bzip2

Usage

Python Library

Simple open, seek, read, and close

import os
from indexed_bzip2 import IndexedBzip2File

file = IndexedBzip2File( "example.bz2", parallelization = os.cpu_count() )

# You can now use it like a normal file
file.seek( 123 )
data = file.read( 100 )
file.close()

The first call to seek will ensure that the block offset list is complete and therefore might have to create it first. Because of this, the first call to seek might take a while.
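
If that initial delay matters, the index can presumably be built eagerly right after opening by querying the block offsets (described in more detail below); a small sketch:

import os
import indexed_bzip2 as ibz2

file = ibz2.open( "example.bz2", parallelization = os.cpu_count() )
file.block_offsets()  # builds the block offset list now; can take a while
file.seek( 123 )      # fast, because the index already exists
data = file.read( 100 )
file.close()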

Use with context manager

import os
import indexed_bzip2 as ibz2

with ibz2.open( "example.bz2", parallelization = os.cpu_count() ) as file:
    file.seek( 123 )
    data = file.read( 100 )

Storing and loading the block offset map

The creation of the list of bzip2 blocks can take a while because it has to decode the bzip2 file completely. To avoid this setup when opening a bzip2 file, the block offset list can be exported and imported.

import indexed_bzip2 as ibz2
import os
import pickle

# Calculate and save bzip2 block offsets
file = ibz2.open( "example.bz2", parallelization = os.cpu_count() )
block_offsets = file.block_offsets() # can take a while
# block_offsets is a simple dictionary where the keys are the bzip2 block offsets in bits(!)
# and the values are the corresponding offsets in the decoded data in bytes. E.g.:
# block_offsets = {32: 0, 14920: 4796}
with open( "offsets.dat", 'wb' ) as offsets_file:
    pickle.dump( block_offsets, offsets_file )
file.close()

# Load bzip2 block offsets for fast seeking
with open( "offsets.dat", 'rb' ) as offsets_file:
    block_offsets = pickle.load( offsets_file )
file2 = ibz2.open( "example.bz2", parallelization = os.cpu_count() )
file2.set_block_offsets( block_offsets ) # should be fast
file2.seek( 123 )
data = file2.read( 100 )
file2.close()

Open a pure Python file-like object for indexed reading

import io
import os
import indexed_bzip2 as ibz2

with open( "example.bz2", 'rb' ) as file:
    in_memory_file = io.BytesIO( file.read() )

with ibz2.open( in_memory_file, parallelization = os.cpu_count() ) as file:
    file.seek( 123 )
    data = file.read( 100 )

Via Ratarmount

Because indexed_bzip2 is used by default as a backend in ratarmount, you can use ratarmount to mount single bzip2 files easily. Furthermore, since ratarmount 0.11.0, parallelization is the default and does not have to be specified explicitly with -P.

base64 /dev/urandom | head -c $(( 512 * 1024 * 1024 )) | bzip2 > sample.bz2
# Serial decoding: 23 s
time bzip2 -c -d sample.bz2 | wc -c

python3 -m pip install --user ratarmount
ratarmount sample.bz2 mounted

# Parallel decoding: 2 s
time cat mounted/sample | wc -c

# Random seeking to the middle of the file and reading 1 MiB: 0.132 s
time dd if=mounted/sample bs=$(( 1024 * 1024 )) \
       iflag=skip_bytes,count_bytes skip=$(( 256 * 1024 * 1024 )) count=$(( 1024 * 1024 )) | wc -c

Command Line Tool

A rudimentary command line tool exists but is not yet shipped with the Python module and instead has to be built from source.

git clone https://github.com/mxmlnkn/indexed_bzip2.git
cd indexed_bzip2
mkdir build
cd build
cmake ..
cmake --build . -- ibzip2

The finished ibzip2 binary is created in the tools subfolder. To install it, copy it, e.g., to /usr/local/bin or to any other folder that is in your PATH. The command line options are similar to those of the existing bzip2 tool.

tools/ibzip2 --help

# Parallel decoding: 1.7 s
time tools/ibzip2 -d -c -P 0 sample.bz2 | wc -c

# Serial decoding: 22 s
time bzip2 -d -c sample.bz2 | wc -c

C++ Library

Because it is written in C++, it can of course also be used as a C++ library. To make heavy use of templates and to simplify compiling with Python setuptools, it is completely header-only, so integrating it into another project should be easy. The license is also permissive enough for most use cases.

I have not yet tested integrating it into other projects other than by manually copying the sources in core and indexed_bzip2. If you have suggestions or wishes, e.g., for CMake or Conan support, please open an issue.

Performance comparison with bz2 module

These are simple timing tests for reading all the contents of a bzip2 file sequentially.

import bz2
import time

with bz2.open( bz2FilePath ) as file:
    t0 = time.time()
    while file.read( 4*1024*1024 ):
        pass
    t1 = time.time()
    print( f"Decoded file in {t1-t0}s" )

The usage of indexed_bzip2 is slightly different:

import indexed_bzip2
import time

# parallelization = 0 means that it is automatically using all available cores.
with indexed_bzip2.IndexedBzip2File( bz2FilePath, parallelization = 0 ) as file:
    t0 = time.time()
    while file.read( 4*1024*1024 ):
        pass
    t1 = time.time()
    print( f"Decoded file in {t1-t0}s" )

Results for an AMD Ryzen 3900X 12-core (24 virtual cores) processor and with bz2FilePath=CTU-13-Dataset.tar.bz2, which is a 2 GB bz2-compressed archive.

Module                                     Runtime / s
bz2                                                414
indexed_bzip2 with parallelization =  0             69
indexed_bzip2 with parallelization =  1            566
indexed_bzip2 with parallelization =  2            315
indexed_bzip2 with parallelization =  6            123
indexed_bzip2 with parallelization = 12             79
indexed_bzip2 with parallelization = 24             70
indexed_bzip2 with parallelization = 32             69

The speedup between the bz2 module and indexed_bzip2 with parallelization = 0 is 414/69 = 6. When using only one core, indexed_bzip2 is slower by (566-414)/414 = 37%.

Internal Architecture

The parallelization of the bzip2 decoder and the support for reading from Python file-like objects required a lot of work to design an architecture that works and can be reasoned about. An earlier architecture was discarded because it became too monolithic. That discarded architecture was even able to work with piped non-seekable input, which the current parallel architecture does not support yet. The serial BZ2Reader still exists but is not shown in the class diagram because it is deprecated and will be removed some time in the future after the ParallelBZ2Reader has proven itself. Click here or the image to get a larger image and here to see an SVG version.

Class Diagram for ParallelBZ2Reader

Tracing the Decoder

Performance profiling and tracing are done with Score-P for instrumentation and Vampir for visualization. This is one way to install Score-P with most of its functionality on Debian 10:

# Install PAPI
wget http://icl.utk.edu/projects/papi/downloads/papi-5.7.0.tar.gz
tar -xf papi-5.7.0.tar.gz
cd papi-5.7.0/src
./configure
make -j
sudo make install

# Install Dependencies
sudo apt-get install libopenmpi-dev openmpi gcc-8-plugin-dev llvm-dev libclang-dev libunwind-dev libopen-trace-format-dev otf-trace

# Install Score-P (to /opt/scorep)
wget https://www.vi-hps.org/cms/upload/packages/scorep/scorep-6.0.tar.gz
tar -xf scorep-6.0.tar.gz
cd scorep-6.0
./configure --with-mpi=openmpi --enable-shared
make -j
make install

# Add /opt/scorep to your path variables on shell start
cat <<'EOF' >> ~/.bashrc  # quoted EOF so the variables expand at shell start, not now
if test -d /opt/scorep; then
    export SCOREP_ROOT=/opt/scorep
    export PATH=$SCOREP_ROOT/bin:$PATH
    export LD_LIBRARY_PATH=$SCOREP_ROOT/lib:$LD_LIBRARY_PATH
fi
EOF

# Check whether it works
scorep --version
scorep-info config-summary

# Actually do the tracing
cd tools
bash trace.sh
