Unofficial packaging of the Compresso algorithm, based on work by Matejek et al.

Project description

Compresso: Efficient Compression of Segmentation Data For Connectomics

*NOTE: This is an unofficial packaging of the work by Matejek et al., which can be found here: https://github.com/VCG/compresso*

Recent advances in segmentation methods for connectomics and biomedical imaging produce very large datasets with labels that assign object classes to image pixels. The resulting label volumes are bigger than the raw image data and need compression for efficient storage and transfer. General-purpose compression methods are less effective because the label data consists of large low-frequency regions with structured boundaries unlike natural image data. We present Compresso, a new compression scheme for label data that outperforms existing approaches by using a sliding window to exploit redundancy across border regions in 2D and 3D. We compare our method to existing compression schemes and provide a detailed evaluation on eleven biomedical and image segmentation datasets. Our method provides a factor of 600-2200x compression for label volumes, with running times suitable for practice.
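The key observation above is that label volumes consist of large constant regions separated by structured boundaries, so most of the information lives in a sparse boundary map. A minimal sketch of that idea (a toy 2D slice and one simple boundary definition; the paper's exact convention may differ):

```python
import numpy as np

# Toy 2D label slice: two flat regions, as a segmentation would produce.
labels = np.zeros((6, 6), dtype=np.uint64)
labels[:, 3:] = 7  # a second object occupies the right half

# Mark a pixel as boundary if it differs from its right or lower neighbor.
boundary = np.zeros_like(labels, dtype=bool)
boundary[:, :-1] |= labels[:, :-1] != labels[:, 1:]
boundary[:-1, :] |= labels[:-1, :] != labels[1:, :]

# Almost all pixels are non-boundary; this sparsity is what the
# sliding-window encoding exploits.
print(boundary.sum(), "boundary pixels out of", boundary.size)
```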

Paper: Matejek et al., "Compresso: Efficient Compression of Segmentation Data For Connectomics", Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI), 2017, 10-14.

Requirements

  • Python 3.5+
  • conda

Setup

git clone https://github.com/vcg/compresso && cd compresso
conda create -n compresso_env --file requirements.txt -c chen -c sunpy -c conda-forge -c auto -c indygreg
source activate compresso_env
# for Compresso scheme as presented in MICCAI
cd experiments/compression/compresso; python setup.py build_ext --inplace
# to run the neuroglancer compression scheme
cd ../neuroglancer; python setup.py build_ext --inplace
# for Compresso v2 that is under development
cd ../../../src/python; python setup.py build_ext --inplace

Compress Segmentation Stacks

There are two versions of Compresso in this repository. Under the src folder there is an updated C++ and Python version that extends the Compresso scheme presented at MICCAI. Among other things, this algorithm implements bit-packing to further improve compression results.
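The repository does not spell out its bit-packing format, but the general idea is easy to illustrate: values bounded by a small maximum need only a few bits each rather than a full 64-bit word. A hypothetical sketch (not the repository's actual code):

```python
def pack(values, bits):
    """Pack each value into `bits` bits of one big integer."""
    packed = 0
    for i, v in enumerate(values):
        assert v < (1 << bits), "value does not fit in the bit width"
        packed |= v << (i * bits)
    return packed

def unpack(packed, bits, count):
    """Recover `count` values of `bits` bits each."""
    mask = (1 << bits) - 1
    return [(packed >> (i * bits)) & mask for i in range(count)]

ids = [3, 0, 5, 1]        # e.g. small per-window pattern indices
packed = pack(ids, 3)     # 3 bits per value instead of 64
assert unpack(packed, 3, len(ids)) == ids
```

Storing four 64-bit values this way takes 12 bits instead of 256, which is the kind of saving bit-packing adds on top of the base encoding.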

The compression scheme in experiments/compression/compresso follows the MICCAI paper exactly.

Compress Your Segmentation Stack

To test Compresso on your own data, simply use:

import compression as C
# With LZMA
C.LZMA.compress(C.COMPRESSO.compress(<NUMPY-3D-ARRAY>))
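The snippet above uses the repository's `compression` module, which is only importable from a built checkout. To get a feel for the second-stage step on its own, here is a self-contained sketch using only Python's standard-library `lzma` on a toy label volume (the Compresso encoding stage is omitted; its output would be fed to LZMA in place of the raw bytes):

```python
import lzma
import numpy as np

# A toy label volume: large constant regions, like real segmentations.
labels = np.zeros((32, 32, 8), dtype=np.uint64)
labels[16:, :, :] = 42

raw = labels.tobytes()
compressed = lzma.compress(raw)

# General-purpose LZMA already shrinks the redundant label data a lot;
# Compresso's encoding stage reduces the input to LZMA even further.
assert lzma.decompress(compressed) == raw
print(len(raw), "->", len(compressed), "bytes")
```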

Experiments

# the dataset must be in HDF5 format
experiments/run.py COMPRESSO LZMA ac3 -r 1 -s 1 -d '/<PATH>/<TO>/<DATA>'

Usage:

usage: run.py [-h] [--directory PATH] [--runs NUM] [--slices NUM]
              [--verbose]
              encoding compression dataset

positional arguments:
  encoding              name of encoding scheme
  compression           name of compression scheme
  dataset               name of data set

optional arguments:
  -h, --help            show this help message and exit
  --directory PATH, -d PATH
                        path to data directory
  --runs NUM, -r NUM    number of runs (default: 1)
  --slices NUM, -s NUM  number of slices per dataset (default: -1 (all))
  --verbose, -v         print progress (default: False) 
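For reference, an argparse setup that reproduces the help text above would look roughly like this (argument names and defaults are inferred from the usage output, not copied from run.py):

```python
import argparse

parser = argparse.ArgumentParser(prog='run.py')
parser.add_argument('encoding', help='name of encoding scheme')
parser.add_argument('compression', help='name of compression scheme')
parser.add_argument('dataset', help='name of data set')
parser.add_argument('--directory', '-d', metavar='PATH',
                    default='~/compresso/data/',
                    help='path to data directory')
parser.add_argument('--runs', '-r', metavar='NUM', type=int, default=1,
                    help='number of runs (default: 1)')
parser.add_argument('--slices', '-s', metavar='NUM', type=int, default=-1,
                    help='number of slices per dataset (default: -1 (all))')
parser.add_argument('--verbose', '-v', action='store_true',
                    help='print progress (default: False)')

# Parse the example invocation from the Experiments section.
args = parser.parse_args(['COMPRESSO', 'LZMA', 'ac3', '-r', '1', '-s', '1'])
print(args.encoding, args.compression, args.dataset)
```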

Make sure the data sets are located in ~/compresso/data/ or specify the location. The data from the paper can be found here:

Results From the Paper

Compression Performance

Compression Performance of Connectomics Datasets

Compression ratios of general-purpose compression methods combined with Compresso and Neuroglancer. Compresso paired with LZMA yields the best compression ratios for all connectomics datasets (left) and on average (four out of five) for the other datasets (right).

Download files

Source Distribution

  • compresso-1.0.0.tar.gz (212.3 kB)

Built Distributions

  • compresso-1.0.0-cp38-cp38-manylinux2014_x86_64.whl (704.6 kB, CPython 3.8)
  • compresso-1.0.0-cp38-cp38-manylinux1_x86_64.whl (511.9 kB, CPython 3.8)
  • compresso-1.0.0-cp38-cp38-macosx_10_9_x86_64.whl (112.0 kB, CPython 3.8, macOS 10.9+ x86-64)
  • compresso-1.0.0-cp37-cp37m-manylinux2014_x86_64.whl (609.7 kB, CPython 3.7m)
  • compresso-1.0.0-cp37-cp37m-manylinux1_x86_64.whl (489.6 kB, CPython 3.7m)
  • compresso-1.0.0-cp37-cp37m-macosx_10_9_x86_64.whl (110.0 kB, CPython 3.7m, macOS 10.9+ x86-64)
  • compresso-1.0.0-cp36-cp36m-manylinux2014_x86_64.whl (607.0 kB, CPython 3.6m)
  • compresso-1.0.0-cp36-cp36m-manylinux1_x86_64.whl (491.6 kB, CPython 3.6m)
  • compresso-1.0.0-cp36-cp36m-macosx_10_9_x86_64.whl (113.1 kB, CPython 3.6m, macOS 10.9+ x86-64)
  • compresso-1.0.0-cp35-cp35m-manylinux2014_x86_64.whl (599.9 kB, CPython 3.5m)
  • compresso-1.0.0-cp35-cp35m-manylinux1_x86_64.whl (482.7 kB, CPython 3.5m)
  • compresso-1.0.0-cp27-cp27m-manylinux1_x86_64.whl (475.0 kB, CPython 2.7m)
