Image pyramid generation specialized for connectomics data types and procedures.

Project description


tinybrain

Image pyramid generation specialized for connectomics data types and procedures. If your brain wasn't tiny before, it will be now.

import tinybrain 

img = load_3d_em_stack()

# factors (2,2), (2,2,1), and (2,2,1,1) are on a fast path
img_pyramid = tinybrain.downsample_with_averaging(img, factor=(2,2,1), num_mips=5)

labels = load_3d_labels()
label_pyramid = tinybrain.downsample_segmentation(labels, factor=(2,2,1), num_mips=5)

Installation

pip install numpy
pip install tinybrain

Motivation

Image hierarchy generation in connectomics uses a few different techniques for visualizing data, but predominantly we create image pyramids of uint8 grayscale images using 2x2 average pooling and of uint8 to uint64 segmentation labels using 2x2 mode pooling.

It's possible to compute both of these using numpy; however, since multiple packages found it useful to copy the downsample functions, it made sense to formalize them into a separate library published on PyPI.

Given the disparate circumstances in which they will be used, these functions should run as fast as possible with low memory usage and avoid numerical issues such as integer truncation while generating multiple mip levels.

Considerations: downsample_with_averaging

It's advisable to generate multiple mip levels at once rather than computing new images recursively, because for integer image types recursion leads to integer truncation issues. In the common case of 2x2x1 downsampling, a recursively computed image loses an average of 0.75 brightness levels per mip level. Therefore, take advantage of the num_mips argument, which strikes a balance that limits integer truncation loss to once every 4 mip levels. This compromise allows the use of integer arithmetic and no more than 2x the input image's memory usage, including the output downsamples. If you need to eliminate the loss beyond 4 mip levels, try promoting the type before downsampling.
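To see why recursive integer averaging loses brightness, here is a small illustrative numpy sketch (not tinybrain's internal code): repeated 2x2 integer averaging truncates a fraction at every level, while a single averaging pass at full precision does not.

```python
import numpy as np

# Illustrative only; not tinybrain's implementation. A 4x4 uint8 patch
# that is all 1s except one 0 pixel: its true mean is 15/16 ~ 0.94.
img = np.full((4, 4), 1, dtype=np.uint8)
img[0, 0] = 0

def avg2x2_int(a):
    # 2x2 average pooling with truncating integer division.
    return ((a[0::2, 0::2].astype(np.uint32) + a[1::2, 0::2] +
             a[0::2, 1::2] + a[1::2, 1::2]) // 4).astype(a.dtype)

mip1 = avg2x2_int(img)   # top-left block: (0+1+1+1)//4 = 0
mip2 = avg2x2_int(mip1)  # (0+1+1+1)//4 = 0: the darkening compounds

print(mip2[0, 0])              # 0 after two recursive truncations
print(int(round(img.mean())))  # 1 when averaged once at full precision
```

This is the loss that computing several mips in one num_mips call (or promoting the dtype first) mitigates.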

A high-performance C++ path is triggered for 2x2x1x1 downsample factors on uint8, uint16, float32, and float64 data types in Fortran order. Other factors, data types, and orderings fall back to a numpy pathway that is much slower and more memory intensive.
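Since the fast path requires Fortran-ordered arrays, it can be worth converting a C-ordered volume first. A minimal sketch using standard numpy (the commented tinybrain call mirrors the example at the top of this README):

```python
import numpy as np

# numpy allocates C-ordered (row-major) arrays by default, which would
# miss the C++ fast path described above.
vol = np.zeros((256, 256, 64), dtype=np.uint8)
assert vol.flags['C_CONTIGUOUS']

# asfortranarray copies into Fortran (column-major) order.
volf = np.asfortranarray(vol)
assert volf.flags['F_CONTIGUOUS']

# mips = tinybrain.downsample_with_averaging(volf, factor=(2,2,1), num_mips=4)
```

The conversion costs one copy of the volume, which is usually repaid by the faster downsample path.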

Example Benchmark

On a 1024x1024x100 uint8 image I ran the following code. PIL and OpenCV are actually much faster than this benchmark shows because most of the time is spent writing to the numpy array. tinybrain has a large advantage when working on 3D and 4D arrays. Of course, this is a very simple benchmark, and it may be possible to tune each of these approaches. On single slices, Pillow was faster than tinybrain.

import time

import numpy as np
import cv2
from PIL import Image
import tinybrain

img = np.load("image.npy")

s = time.time()
# "Original": the pure-numpy downsample_with_averaging this library replaces
downsample_with_averaging(img, (2,2,1))
print("Original ", time.time() - s)

s = time.time()
out = tinybrain.downsample_with_averaging(img, (2,2,1))
print("tinybrain ", time.time() - s)

s = time.time()
out = np.zeros(shape=(512,512,100))
for z in range(img.shape[2]):
  out[:,:,z] = cv2.resize(img[:,:,z], dsize=(512, 512))
print("OpenCV ", time.time() - s)

s = time.time()
out = np.zeros(shape=(512,512,100))
for z in range(img.shape[2]):
  pilimg = Image.fromarray(img[:,:,z])
  out[:,:,z] = pilimg.resize( (512, 512) )
print("Pillow ", time.time() - s)

# Method     Run Time             Rel. Perf.
# Original   1820 ms +/- 3.73 ms    1.0x
# tinybrain    67 ms +/- 0.40 ms   27.2x 
# OpenCV      469 ms +/- 1.12 ms    3.9x
# Pillow      937 ms +/- 7.63 ms    1.9x

Considerations: downsample_segmentation

The downsample_segmentation function performs mode pooling provided the downsample factor is a power of two, including in three dimensions. For non-power-of-two factors, striding is used. The mode pooling, which is usually what you want, is computed recursively. Mode pooling is superior to striding, but the recursive calculation can introduce defects at mip levels higher than 1. This may be improved in the future.
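As a reference for what 2x2 mode pooling computes, here is a straightforward (and deliberately slow) numpy sketch. It is not tinybrain's countless implementation, and its tie-breaking (smallest label wins, via np.unique ordering) may differ from countless:

```python
import numpy as np

def mode_pool_2x2(labels):
    """Naive 2x2 mode pooling: each output voxel is the most common of
    the four labels in its block. Illustrative only; ties resolve to
    the smallest label here, which may differ from countless."""
    # Gather the four corners of every 2x2 block: shape (4, H/2, W/2)
    corners = np.stack([labels[0::2, 0::2], labels[1::2, 0::2],
                        labels[0::2, 1::2], labels[1::2, 1::2]])
    out = np.empty(corners.shape[1:], dtype=labels.dtype)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            vals, counts = np.unique(corners[:, i, j], return_counts=True)
            out[i, j] = vals[np.argmax(counts)]
    return out

labels = np.array([[5, 5, 2, 2],
                   [5, 7, 2, 9],
                   [1, 1, 3, 3],
                   [1, 4, 3, 3]], dtype=np.uint32)
print(mode_pool_2x2(labels))
# [[5 2]
#  [1 3]]
```

Unlike averaging, mode pooling never invents a label that wasn't present in the block, which is why it is the right choice for segmentation data.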

The calculation is actually performed by an ensemble of several different methods. For (2,2,1,1) downsamples, a fast, low-memory Cython path implementing countless is selected. For (4,4,1) and other 2D powers of two, the countless 2d algorithm is used. For (2,2,2), (4,4,4), etc., the dynamic countless 3d algorithm is used. For 2D powers of two with the sparse flag enabled, stippled countless 2d is used. For all other configurations, striding is used.

Countless 2d paths are also fast but use slightly more memory and time. Countless 3d is fine for (2,2,2) and (4,4,4), but its time and memory usage grow exponentially with the product of the factor dimensions. This state of affairs could be improved by implementing a counting-based algorithm in Cython/C++ that handles arbitrary factors without recursion. The countless algorithms were developed before I knew how to write Cython and package libraries. That said, C++ implementations of countless are much faster than counting for computing the first mip level; in particular, an AVX2 SIMD implementation can saturate memory bandwidth.

Documentation for the countless algorithm family is located here: https://github.com/william-silversmith/countless

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

tinybrain-0.1.1.tar.gz (217.1 kB view details)

Uploaded Source

Built Distributions

tinybrain-0.1.1-cp38-cp38-manylinux1_x86_64.whl (839.0 kB view details)

Uploaded CPython 3.8

tinybrain-0.1.1-cp38-cp38-macosx_10_9_x86_64.whl (367.1 kB view details)

Uploaded CPython 3.8macOS 10.9+ x86-64

tinybrain-0.1.1-cp37-cp37m-manylinux1_x86_64.whl (803.7 kB view details)

Uploaded CPython 3.7m

tinybrain-0.1.1-cp37-cp37m-macosx_10_9_x86_64.whl (365.2 kB view details)

Uploaded CPython 3.7mmacOS 10.9+ x86-64

tinybrain-0.1.1-cp36-cp36m-manylinux1_x86_64.whl (804.7 kB view details)

Uploaded CPython 3.6m

tinybrain-0.1.1-cp36-cp36m-macosx_10_9_x86_64.whl (364.8 kB view details)

Uploaded CPython 3.6mmacOS 10.9+ x86-64

tinybrain-0.1.1-cp35-cp35m-manylinux1_x86_64.whl (795.2 kB view details)

Uploaded CPython 3.5m

tinybrain-0.1.1-cp35-cp35m-macosx_10_6_intel.whl (489.1 kB view details)

Uploaded CPython 3.5mmacOS 10.6+ Intel (x86-64, i386)

tinybrain-0.1.1-cp27-cp27m-manylinux1_x86_64.whl (790.3 kB view details)

Uploaded CPython 2.7m

tinybrain-0.1.1-cp27-cp27m-macosx_10_14_intel.whl (373.5 kB view details)

Uploaded CPython 2.7mmacOS 10.14+ Intel (x86-64, i386)

File details

Details for the file tinybrain-0.1.1.tar.gz.

File metadata

  • Download URL: tinybrain-0.1.1.tar.gz
  • Upload date:
  • Size: 217.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.6.8

File hashes

Hashes for tinybrain-0.1.1.tar.gz
Algorithm Hash digest
SHA256 8b0a4136da940c650f72386e5c2b185ac1af31584dd86ef4a72150de120cbff7
MD5 fa85545b5b235a7f5ce5d0aa7f6d3cc5
BLAKE2b-256 62834cc00ffee635915dfe8d78f789f24a894208d549456c434df2f92584f984

See more details on using hashes here.

File details

Details for the file tinybrain-0.1.1-cp38-cp38-manylinux1_x86_64.whl.

File metadata

  • Download URL: tinybrain-0.1.1-cp38-cp38-manylinux1_x86_64.whl
  • Upload date:
  • Size: 839.0 kB
  • Tags: CPython 3.8
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/40.6.2 requests-toolbelt/0.9.1 tqdm/4.32.1 CPython/3.7.2

File hashes

Hashes for tinybrain-0.1.1-cp38-cp38-manylinux1_x86_64.whl
Algorithm Hash digest
SHA256 3226b4ab9f45a0c180194b8de745dc997bf923e3f3fcf715313dbbb199927061
MD5 3212903b721ca0bded4cd08cd943c6ee
BLAKE2b-256 a4934579d111f0ab885d6f8aa6c8b8d5d7dd03d2f8969d403c2ce3fcdec83159


File details

Details for the file tinybrain-0.1.1-cp38-cp38-macosx_10_9_x86_64.whl.

File metadata

  • Download URL: tinybrain-0.1.1-cp38-cp38-macosx_10_9_x86_64.whl
  • Upload date:
  • Size: 367.1 kB
  • Tags: CPython 3.8, macOS 10.9+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/40.6.2 requests-toolbelt/0.9.1 tqdm/4.32.1 CPython/3.7.2

File hashes

Hashes for tinybrain-0.1.1-cp38-cp38-macosx_10_9_x86_64.whl
Algorithm Hash digest
SHA256 b7a48c927ad6a62096e4ec1a9c221c060c9a8cdae831253a6e3db7a23f8c44c4
MD5 bf43b0079f5e1c98581edf97a0b40304
BLAKE2b-256 4fe1e91bfd6e3bf4b4929696cdec8b21c618c6977a5c1f729a53e44f37b30211


File details

Details for the file tinybrain-0.1.1-cp37-cp37m-manylinux1_x86_64.whl.

File metadata

  • Download URL: tinybrain-0.1.1-cp37-cp37m-manylinux1_x86_64.whl
  • Upload date:
  • Size: 803.7 kB
  • Tags: CPython 3.7m
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.6.8

File hashes

Hashes for tinybrain-0.1.1-cp37-cp37m-manylinux1_x86_64.whl
Algorithm Hash digest
SHA256 f804c2af0e2c2c9cf08da1f25932bb3777cb0d5cce1355cd50f7a082783a08df
MD5 4de092b9146ecb54ebbcfdf752b9fbc2
BLAKE2b-256 43c82fd9cac3e9446fb7767a7618d42b322bb85239317f5b87c0f979292d6307


File details

Details for the file tinybrain-0.1.1-cp37-cp37m-macosx_10_9_x86_64.whl.

File metadata

  • Download URL: tinybrain-0.1.1-cp37-cp37m-macosx_10_9_x86_64.whl
  • Upload date:
  • Size: 365.2 kB
  • Tags: CPython 3.7m, macOS 10.9+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.2

File hashes

Hashes for tinybrain-0.1.1-cp37-cp37m-macosx_10_9_x86_64.whl
Algorithm Hash digest
SHA256 f948c01b3a7b5f618cb42cda0911372701c171a71d4d9fcab5244dd65a83b610
MD5 25f01805fa473bcc9679730e3195e861
BLAKE2b-256 de135f8c971ea244622264bb014caadf7138f06a4fe437f0b328677a48c8cb6c


File details

Details for the file tinybrain-0.1.1-cp36-cp36m-manylinux1_x86_64.whl.

File metadata

  • Download URL: tinybrain-0.1.1-cp36-cp36m-manylinux1_x86_64.whl
  • Upload date:
  • Size: 804.7 kB
  • Tags: CPython 3.6m
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.6.8

File hashes

Hashes for tinybrain-0.1.1-cp36-cp36m-manylinux1_x86_64.whl
Algorithm Hash digest
SHA256 6910e1b0ffd43150294fb5fba04e57a2bbb89bbb29fe298534542e9fc82e86e9
MD5 1b63c72b2ff5f6087ce52b56282d34dd
BLAKE2b-256 6c80833213e15dc192ddc627d49cba34cc52569c45f283188af8e2865996866e


File details

Details for the file tinybrain-0.1.1-cp36-cp36m-macosx_10_9_x86_64.whl.

File metadata

  • Download URL: tinybrain-0.1.1-cp36-cp36m-macosx_10_9_x86_64.whl
  • Upload date:
  • Size: 364.8 kB
  • Tags: CPython 3.6m, macOS 10.9+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/40.6.2 requests-toolbelt/0.9.1 tqdm/4.32.1 CPython/3.7.2

File hashes

Hashes for tinybrain-0.1.1-cp36-cp36m-macosx_10_9_x86_64.whl
Algorithm Hash digest
SHA256 146d39b039c68e721443f02a7bc76cc57511957e74b033f8f83b95c2daa19dff
MD5 a90538b7213f4ec1fbd123a5ec0026a9
BLAKE2b-256 34b6beebc44815f9d0154005affc4c976f2d7c1b85c890f04116621e9d825b15


File details

Details for the file tinybrain-0.1.1-cp35-cp35m-manylinux1_x86_64.whl.

File metadata

  • Download URL: tinybrain-0.1.1-cp35-cp35m-manylinux1_x86_64.whl
  • Upload date:
  • Size: 795.2 kB
  • Tags: CPython 3.5m
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.6.8

File hashes

Hashes for tinybrain-0.1.1-cp35-cp35m-manylinux1_x86_64.whl
Algorithm Hash digest
SHA256 f0e1709bb3aa37f047cc7c0b91525274c269b42e200a1b5e339c4dc706ea0bff
MD5 45a09007321701778c8afd1352827d0e
BLAKE2b-256 36abac35f50add8414e2c18e6ca9fa637001c9c090abb20996c0e1c6eac50241


File details

Details for the file tinybrain-0.1.1-cp35-cp35m-macosx_10_6_intel.whl.

File metadata

  • Download URL: tinybrain-0.1.1-cp35-cp35m-macosx_10_6_intel.whl
  • Upload date:
  • Size: 489.1 kB
  • Tags: CPython 3.5m, macOS 10.6+ Intel (x86-64, i386)
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/40.6.2 requests-toolbelt/0.9.1 tqdm/4.32.1 CPython/3.7.2

File hashes

Hashes for tinybrain-0.1.1-cp35-cp35m-macosx_10_6_intel.whl
Algorithm Hash digest
SHA256 bac2c98e09d153f452515166f8f31bdfaed883b3162e556404ca70f928823192
MD5 21469fdfd5777c2d8fafa9441959b5cf
BLAKE2b-256 2931664ffc41b6ae38df1552a439f26fe927d29d110616458360f4c13e3230ff


File details

Details for the file tinybrain-0.1.1-cp27-cp27m-manylinux1_x86_64.whl.

File metadata

  • Download URL: tinybrain-0.1.1-cp27-cp27m-manylinux1_x86_64.whl
  • Upload date:
  • Size: 790.3 kB
  • Tags: CPython 2.7m
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.6.8

File hashes

Hashes for tinybrain-0.1.1-cp27-cp27m-manylinux1_x86_64.whl
Algorithm Hash digest
SHA256 794d6f24883b2f271e72780aaea4ece1ef3b67502f88d27acdb3fbee2ab67369
MD5 b10966ec841535a0fddb199fbd0a1698
BLAKE2b-256 df478c557dd124f63dd8f04dec5612a67e2025fccd4dbf5ce8289b2dd2a3152b


File details

Details for the file tinybrain-0.1.1-cp27-cp27m-macosx_10_14_intel.whl.

File metadata

  • Download URL: tinybrain-0.1.1-cp27-cp27m-macosx_10_14_intel.whl
  • Upload date:
  • Size: 373.5 kB
  • Tags: CPython 2.7m, macOS 10.14+ Intel (x86-64, i386)
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/40.6.2 requests-toolbelt/0.9.1 tqdm/4.32.1 CPython/3.7.2

File hashes

Hashes for tinybrain-0.1.1-cp27-cp27m-macosx_10_14_intel.whl
Algorithm Hash digest
SHA256 c9bf38a74e44deb74f10836b370bedcb2141438e44832ec6e2902daaf1fda661
MD5 64cc2310e483c4c8aabacf0b5e0bb16d
BLAKE2b-256 6c9bc64722d7f74330ec2a04357b5b677600c135bf34ca2c28475ce6c75d6c7a

