[![Build Status](https://travis-ci.org/seung-lab/tinybrain.svg?branch=master)](https://travis-ci.org/seung-lab/tinybrain) [![PyPI version](https://badge.fury.io/py/tinybrain.svg)](https://badge.fury.io/py/tinybrain)

# tinybrain

Image pyramid generation specialized for connectomics data types and procedures. If your brain wasn't tiny before, it will be now.

```python
import tinybrain

img = load_3d_em_stack()

# factors (2,2), (2,2,1), and (2,2,1,1) are on a fast path
img_pyramid = tinybrain.downsample_with_averaging(img, factor=(2,2,1), num_mips=5)

labels = load_3d_labels()
label_pyramid = tinybrain.downsample_segmentation(labels, factor=(2,2,1), num_mips=5)
```

## Installation

```bash
pip install numpy
pip install tinybrain
```

## Motivation

Image hierarchy generation in connectomics uses a few different techniques for
visualizing data, but predominantly we create image pyramids of uint8 grayscale
images using 2x2 average pooling and of uint8 to uint64 segmentation labels using
2x2 mode pooling.
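
For reference, 2x2 average pooling over a 3D stack can be sketched in plain numpy; mode pooling for labels is analogous, picking the most frequent value per block instead of the mean. `average_pool_2x2` here is a hypothetical illustration, not part of tinybrain's API:

```python
import numpy as np

def average_pool_2x2(img):
    # 2x2 average pooling over the x and y axes of a 3D stack.
    h = img.shape[0] // 2 * 2  # trim odd edges for simplicity
    w = img.shape[1] // 2 * 2
    img = img[:h, :w].astype(np.float64)
    return (img[0::2, 0::2] + img[1::2, 0::2]
          + img[0::2, 1::2] + img[1::2, 1::2]) / 4.0

img = np.arange(16, dtype=np.uint8).reshape(4, 4, 1)
out = average_pool_2x2(img)  # shape (2, 2, 1)
```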

It's possible to compute both of these using numpy, but as multiple packages found
it useful to copy the downsample functions, it makes sense to formalize them
into a separate library located on PyPI.

Given the disparate circumstances they will be used in, these functions should run
as fast as possible with low memory usage and avoid numerical issues such as integer
truncation while generating multiple mip levels.

## Considerations: downsample_with_averaging

It's advisable to generate multiple mip levels at once rather than computing
new images recursively, because for integer typed images recursion leads to integer
truncation issues. In the common case of 2x2x1 downsampling, a recursively computed
image can lose up to 0.75 of a brightness level per mip level. Therefore, take
advantage of the `num_mips` argument, which strikes a balance that limits integer
truncation loss to once every 4 mip levels. This compromise allows for the use of
integer arithmetic and no more than 2x the input image's memory usage, including
the output downsamples. If you seek to eliminate the loss beyond 4 mip levels, try
promoting the type before downsampling.
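
To see why recursion compounds the loss, here is a minimal sketch. `avg2x2_int` is a hypothetical illustration of naive truncating downsampling, not tinybrain's implementation:

```python
import numpy as np

def avg2x2_int(img):
    # Naive 2x2 average with truncating integer division, as a
    # recursive downsampler would compute each mip level.
    s = (img[0::2, 0::2].astype(np.uint32) + img[1::2, 0::2]
       + img[0::2, 1::2] + img[1::2, 1::2])
    return (s // 4).astype(img.dtype)

img = np.array([
  [1, 2, 2, 2],
  [2, 2, 2, 2],
  [2, 2, 2, 2],
  [2, 2, 2, 3],
], dtype=np.uint8)

# Recursive: truncate at every level.
mip1 = avg2x2_int(img)   # [[1, 2], [2, 2]]
mip2 = avg2x2_int(mip1)  # [[1]] -- darkened by compounding truncation

# Accumulate in a wide type and truncate once, as generating
# multiple mips at a time allows. The true mean is exactly 2.
true_mip2 = img.astype(np.uint32).sum() // 16  # 2
```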

A high performance C++ path is triggered for 2x2x1x1 downsample factors on uint8, uint16, float32,
and float64 data types in Fortran order. Other factors, data types, and orderings fall back to a numpy pathway that is much slower and more memory intensive.


### Example Benchmark

On a 1024x1024x100 uint8 image I ran the following code. PIL and OpenCV are actually much faster than this benchmark shows because most of the time is spent writing to the numpy array. tinybrain has a large advantage working on 3D and 4D arrays. Of course, this is a very simple benchmark and it may be possible to tune each of these approaches. On single slices, Pillow was faster than tinybrain.

```python
import time

import numpy as np
import cv2
import tinybrain
from PIL import Image

img = np.load("image.npy")  # 1024x1024x100 uint8

# "Original": the pure numpy downsample_with_averaging implementation
# that tinybrain replaces (defined elsewhere).
s = time.time()
downsample_with_averaging(img, (2,2,1))
print("Original ", time.time() - s)

s = time.time()
out = tinybrain.downsample_with_averaging(img, (2,2,1))
print("tinybrain ", time.time() - s)

s = time.time()
out = np.zeros(shape=(512,512,100))
for z in range(img.shape[2]):
  out[:,:,z] = cv2.resize(img[:,:,z], dsize=(512, 512))
print("OpenCV ", time.time() - s)

s = time.time()
out = np.zeros(shape=(512,512,100))
for z in range(img.shape[2]):
  pilimg = Image.fromarray(img[:,:,z])
  out[:,:,z] = pilimg.resize( (512, 512) )
print("Pillow ", time.time() - s)

# Method      Run Time              Rel. Perf.
# Original    1820 ms +/- 3.73 ms    1.0x
# tinybrain     67 ms +/- 0.40 ms   27.2x
# OpenCV       469 ms +/- 1.12 ms    3.9x
# Pillow       937 ms +/- 7.63 ms    1.9x
```

## Considerations: downsample_segmentation

To be continued.
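
In the meantime, the core operation, 2x2 mode (majority) pooling, can be sketched in plain numpy. `mode_pool_2x2` is a hypothetical illustration of the idea, not tinybrain's implementation; it breaks count ties in favor of the block's top-left value:

```python
import numpy as np

def mode_pool_2x2(labels):
    # 2x2 mode (majority) pooling over a 2D label image: each output
    # pixel is the most frequent of the four input labels in its block.
    a = labels[0::2, 0::2]  # top-left of each block
    b = labels[1::2, 0::2]
    c = labels[0::2, 1::2]
    d = labels[1::2, 1::2]
    out = a.copy()
    # Any pair that agrees (count >= 2) beats a lone value...
    for x, y in ((b, c), (b, d), (c, d)):
        out = np.where(x == y, x, out)
    # ...but a match involving the top-left value wins ties.
    return np.where((a == b) | (a == c) | (a == d), a, out)

labels = np.array([
  [1, 1, 2, 3],
  [1, 2, 3, 3],
])
mode_pool_2x2(labels)  # array([[1, 3]])
```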


