
[![Build Status](https://travis-ci.org/seung-lab/tinybrain.svg?branch=master)](https://travis-ci.org/seung-lab/tinybrain) [![PyPI version](https://badge.fury.io/py/tinybrain.svg)](https://badge.fury.io/py/tinybrain)

# tinybrain

Image pyramid generation specialized for connectomics data types and procedures. If your brain wasn't tiny before, it will be now.

```python
import tinybrain

img = load_3d_em_stack()

# factors (2,2), (2,2,1), and (2,2,1,1) are on a fast path
img_pyramid = tinybrain.downsample_with_averaging(img, factor=(2,2,1), num_mips=5)

labels = load_3d_labels()
label_pyramid = tinybrain.downsample_segmentation(labels, factor=(2,2,1), num_mips=5)
```

## Motivation

Image hierarchy generation in connectomics uses a few different techniques for
visualizing data, but predominantly we create image pyramids of uint8 grayscale
images using 2x2 average pooling and of uint8 to uint64 segmentation labels using
2x2 mode pooling.

It's possible to compute both of these using numpy; however, since multiple packages
found it useful to copy the same downsample functions, it makes sense to formalize
them into a separate library published on PyPI.
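
For reference, here is a minimal numpy sketch of the 2x2 average pooling that kept
getting copied between packages. The function name `downsample_with_averaging_np` is
illustrative only, not tinybrain's API:

```python
import numpy as np

def downsample_with_averaging_np(img):
    """Naive 2x2 average pooling over the first two axes (illustrative only)."""
    # Trim to even dimensions so the 2x2 blocks tile exactly.
    img = img[: img.shape[0] // 2 * 2, : img.shape[1] // 2 * 2]
    # Accumulate in a wider type so uint8 sums don't overflow before dividing.
    acc = img.astype(np.float64)
    out = (acc[0::2, 0::2] + acc[1::2, 0::2] + acc[0::2, 1::2] + acc[1::2, 1::2]) / 4.0
    # Casting back to an integer type truncates -- the loss discussed below.
    return out.astype(img.dtype)
```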

Given the disparate circumstances they will be used in, these functions should run
as fast as possible with low memory usage and avoid numerical issues such as integer
truncation while generating multiple mip levels.

## Considerations: downsample_with_averaging

It's advisable to generate multiple mip levels at once rather than recursively computing
new images, because for integer typed images recursion compounds integer truncation. In
the common case of 2x2x1 downsampling, a recursively computed image can lose up to 0.75
brightness per mip level. Therefore, take advantage of the `num_mips` argument, which
strikes a balance that limits integer truncation loss to once every 4 mip levels. This
compromise allows for the use of integer arithmetic while using no more memory than 2x
the input image, including the output downsamples. If you need to eliminate the loss
beyond 4 mip levels, try promoting the type before downsampling.
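
To make the truncation concrete, here is a small numpy illustration (not tinybrain's
internals) of the worst case for a single 2x2 average:

```python
import numpy as np

# Four uint8 pixels whose true mean is 2.75; integer division floors to 2,
# losing 0.75 brightness -- the worst case for one 2x2 average.
block = np.array([[2, 3], [3, 3]], dtype=np.uint8)
print(int(block.sum()) // 4)             # 2

# Recursive downsampling repeats that floor at every mip level. Promoting the
# type first (or batching mips via num_mips) defers the truncation.
print(block.astype(np.float32).mean())   # 2.75
```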

A high performance C++ path is triggered for 2x2x1x1 downsample factors on uint8, uint16, float32,
and float64 data types in Fortran order. Other factors, data types, and orderings fall back to a numpy pathway that is much slower and more memory intensive.
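
If your arrays arrive in C order (numpy's default), a cheap way to opt into the fast
path is to convert before downsampling; a sketch, using random data as a stand-in:

```python
import numpy as np
import tinybrain

# numpy defaults to C order, which falls back to the slow path; converting with
# asfortranarray costs one copy but unlocks the C++ path for uint8/uint16/
# float32/float64 data and 2x2x1x1 factors.
img = np.random.randint(0, 256, size=(1024, 1024, 64), dtype=np.uint8)
img = np.asfortranarray(img)

mips = tinybrain.downsample_with_averaging(img, factor=(2, 2, 1), num_mips=4)
```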


### Example Benchmark

On a 1024x1024x100 uint8 image I ran the following code. PIL and OpenCV are actually much faster than this benchmark shows because most of the time is spent writing to the numpy array. tinybrain has a large advantage when working on 3D and 4D arrays. Of course, this is a very simple benchmark, and it may be possible to tune each of these approaches. On single slices, Pillow was faster than tinybrain.

```python
import time

import numpy as np
import cv2
import tinybrain
from PIL import Image

img = np.load("image.npy")  # 1024x1024x100 uint8

s = time.time()
# "Original": the pure numpy downsample function this library formalizes
downsample_with_averaging(img, (2,2,1))
print("Original ", time.time() - s)

s = time.time()
out = tinybrain.downsample_with_averaging(img, (2,2,1))
print("tinybrain ", time.time() - s)

s = time.time()
out = np.zeros(shape=(512,512,100))
for z in range(img.shape[2]):
    out[:,:,z] = cv2.resize(img[:,:,z], dsize=(512, 512))
print("OpenCV ", time.time() - s)

s = time.time()
out = np.zeros(shape=(512,512,100))
for z in range(img.shape[2]):
    pilimg = Image.fromarray(img[:,:,z])
    out[:,:,z] = pilimg.resize( (512, 512) )
print("Pillow ", time.time() - s)

# Method     Run Time   Rel. Perf.
# Original   1.85 sec    1.0x
# tinybrain  0.09 sec   20.6x
# OpenCV     0.47 sec    3.9x
# Pillow     0.90 sec    2.1x
```

## Considerations: downsample_segmentation

To be continued.
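
In the meantime, the 2x2 mode pooling described under Motivation can be sketched
naively in numpy. `mode_pool_2x2` is a hypothetical helper for illustration only,
not tinybrain's (optimized) implementation:

```python
import numpy as np

def mode_pool_2x2(labels):
    """Naive, slow 2x2 mode pooling: each output pixel takes the most
    frequent label in its 2x2 block (illustrative only)."""
    # Trim to even dimensions so the 2x2 blocks tile exactly.
    labels = labels[: labels.shape[0] // 2 * 2, : labels.shape[1] // 2 * 2]
    # Gather the four corners of every 2x2 block along a new trailing axis.
    blocks = np.stack([
        labels[0::2, 0::2], labels[1::2, 0::2],
        labels[0::2, 1::2], labels[1::2, 1::2],
    ], axis=-1)
    out = np.empty(blocks.shape[:-1], dtype=labels.dtype)
    for idx in np.ndindex(*blocks.shape[:-1]):
        vals, counts = np.unique(blocks[idx], return_counts=True)
        out[idx] = vals[np.argmax(counts)]  # ties go to the smallest label
    return out
```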


