tinybrain

Image pyramid generation specialized for connectomics data types and procedures. If your brain wasn't tiny before, it will be now.

import tinybrain 

img = load_3d_em_stack()

# 2x2 and 2x2x2 downsamples are on a fast path.
# e.g. (2,2), (2,2,1), (2,2,1,1), (2,2,2), (2,2,2,1)
img_pyramid = tinybrain.downsample_with_averaging(img, factor=(2,2,1), num_mips=5, sparse=False)

labels = load_3d_labels()
label_pyramid = tinybrain.downsample_segmentation(labels, factor=(2,2,1), num_mips=5, sparse=False)

# We also have a few other types.
img_pyramid = tinybrain.downsample_with_min_pooling(img, factor=(2,2,1), num_mips=5)
img_pyramid = tinybrain.downsample_with_max_pooling(img, factor=(2,2,1), num_mips=5)
img_pyramid = tinybrain.downsample_with_striding(img, factor=(2,2,1), num_mips=5)

Installation

pip install numpy
pip install tinybrain

Motivation

Image hierarchy generation in connectomics uses a few different techniques for visualizing data, but predominantly we create image pyramids of uint8 grayscale images using 2x2 average pooling and of uint8 to uint64 segmentation labels using 2x2 mode pooling. When images become very large and people wish to visualize upper mip levels using three axes at once, it becomes desirable to perform 2x2x2 downsamples to maintain isotropy.
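In essence, 2x2 average pooling replaces each 2x2 patch with its mean, while mode pooling replaces it with its most frequent label. A minimal numpy sketch of the averaging case (illustrative only, not tinybrain's implementation):

```python
import numpy as np

def average_pool_2x2(img):
    """Naive 2x2 average pooling over the first two axes."""
    x, y = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2  # drop odd remainder
    view = img[:x, :y].reshape(x // 2, 2, y // 2, 2)
    return view.mean(axis=(1, 3))  # mean over each 2x2 patch

img = np.arange(16, dtype=np.float32).reshape(4, 4)
small = average_pool_2x2(img)
print(small)  # [[ 2.5  4.5] [10.5 12.5]]
```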

It's possible to compute both of these using numpy; however, as multiple packages found it useful to copy the downsample functions, it made sense to formalize them into a separate library located on PyPI.

Given the disparate circumstances in which they will be used, these functions should run as fast as possible with low memory usage, and should avoid numerical issues such as integer truncation while generating multiple mip levels.

Considerations: downsample_with_averaging

It's advisable to generate multiple mip levels at once rather than computing new images recursively, because for integer-typed images recursion leads to integer truncation issues. In the common case of 2x2x1 downsampling, a recursively computed image can lose up to 0.75 units of brightness per mip level (the worst-case truncation when averaging four integers). Therefore, take advantage of the num_mips argument, which strikes a balance that limits integer truncation loss to once every 4 mip levels. This compromise allows the use of integer arithmetic while requiring no more memory than 2x the input image, including the output downsamples. If you seek to eliminate the loss beyond 4 mip levels, try promoting the type before downsampling. 2x2x2x1 downsamples truncate every 8 mip levels.
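The truncation effect is easy to demonstrate with plain numpy. This sketch (not tinybrain code) compares two rounds of recursive integer 2x2 averaging against the exact floating-point average of each 4x4 block; the recursive result is systematically dimmer:

```python
import numpy as np

def downsample_once(a):
    # Integer 2x2 average; // truncates the fractional part on every call.
    s = (a[0::2, 0::2].astype(np.uint32) + a[1::2, 0::2]
         + a[0::2, 1::2] + a[1::2, 1::2])
    return (s // 4).astype(a.dtype)

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)

r2 = downsample_once(downsample_once(img))     # two truncations (recursive)
exact = img.reshape(64, 4, 64, 4).mean(axis=(1, 3))  # true 4x4 block mean

# On average each recursion loses ~0.375 brightness units, so two levels
# lose roughly 0.75 relative to the exact computation.
print(float((exact - r2).mean()))
```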

A high-performance C++ path is triggered for 2x2x1x1 and 2x2x2x1 downsample factors on uint8, uint16, float32, and float64 data types in Fortran order. Other factors, data types, and orderings fall back to a numpy pathway that is much slower and more memory intensive.
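Since numpy arrays are C-ordered by default, converting once up front lets subsequent calls take the fast path. A small sketch:

```python
import numpy as np

img = np.zeros((64, 64, 8), dtype=np.uint8)  # numpy default is C (row-major) order
print(img.flags["F_CONTIGUOUS"])             # False

# tinybrain's C++ fast path requires Fortran (column-major) order;
# asfortranarray copies only if a conversion is actually needed.
fimg = np.asfortranarray(img)
print(fimg.flags["F_CONTIGUOUS"])            # True
```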

We also include a sparse mode for 2x2x2 downsamples, which prevents "ghosting", where a z-slice that overlaps a black region on the next slice becomes semi-transparent after downsampling. We handle this by excluding background (zero) voxels from the averaging operation.
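The sparse idea can be sketched in numpy (illustrative only, not tinybrain's implementation): divide each patch's sum by the count of nonzero voxels rather than by 8, so a bright slice over empty background keeps its brightness:

```python
import numpy as np

def sparse_average_2x2x2(a):
    """Average each 2x2x2 patch over its nonzero voxels only."""
    a = a.astype(np.float64)
    blocks = a.reshape(a.shape[0] // 2, 2, a.shape[1] // 2, 2, a.shape[2] // 2, 2)
    s = blocks.sum(axis=(1, 3, 5))          # patch sums
    n = (blocks != 0).sum(axis=(1, 3, 5))   # nonzero voxel counts
    return np.where(n > 0, s / np.maximum(n, 1), 0)

vol = np.zeros((2, 2, 2), dtype=np.uint8)
vol[:, :, 0] = 200  # one bright slice over an empty background slice
result = sparse_average_2x2x2(vol)
print(result)  # 200.0, not the half-transparent 100.0 a dense average gives
```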

Example Benchmark

On a 1024x1024x100 uint8 image, I ran the following code. PIL and OpenCV are actually much faster than this benchmark shows because most of the time is spent writing to the numpy array. tinybrain has a large advantage when working on 3D and 4D arrays. Of course, this is a very simple benchmark, and it may be possible to tune each of these approaches. On single slices, Pillow was faster than tinybrain.

import time
import numpy as np
import cv2
from PIL import Image
import tinybrain

img = np.load("image.npy")

s = time.time()
downsample_with_averaging(img, (2,2,1))  # "Original": the plain numpy reference implementation
print("Original ", time.time() - s)

s = time.time()
out = tinybrain.downsample_with_averaging(img, (2,2,1))
print("tinybrain ", time.time() - s)

s = time.time()
out = np.zeros(shape=(512,512,100))
for z in range(img.shape[2]):
  out[:,:,z] = cv2.resize(img[:,:,z], dsize=(512, 512))
print("OpenCV ", time.time() - s)

s = time.time()
out = np.zeros(shape=(512,512,100))
for z in range(img.shape[2]):
  pilimg = Image.fromarray(img[:,:,z])
  out[:,:,z] = pilimg.resize( (512, 512) )
print("Pillow ", time.time() - s)

# Method     Run Time             Rel. Perf.
# Original   1820 ms +/- 3.73 ms    1.0x
# tinybrain    67 ms +/- 0.40 ms   27.2x 
# OpenCV      469 ms +/- 1.12 ms    3.9x
# Pillow      937 ms +/- 7.63 ms    1.9x

Here's the output from perf.py on an Apple Silicon 2021 MacBook Pro (M1). Note that the image used was a random 2048x2048x64 array: uint8 for average pooling and uint64 for mode pooling, to represent real use cases more fairly. Read the table as 2D or 3D downsamples, generating a single or multiple mip levels, with sparse mode enabled or disabled. Speeds are in megavoxels per second and are the mean of ten runs.

dwnsmpl  mips  sparse  AVG (MVx/sec)  MODE (MVx/sec)
2x2       1      N         3856.07        1057.87
2x2       2      N         2685.80        1062.69
2x2       1      Y             N/A         129.64
2x2       2      Y             N/A          81.62
2x2x2     1      N         4468.55         336.85
2x2x2     2      N         2867.80         298.45
2x2x2     1      Y         1389.47         337.87
2x2x2     2      Y         1259.58         293.84

As the downsampling code's performance is data dependent due to branching, I also used connectomics.npy (a 512^3 uint32 volume extended to uint64) to see how that affected performance. This data comes from mouse visual cortex and has many equal adjacent voxels. In this volume, the 2x2x2 non-sparse mode is much faster, as the "instant" majority detection can skip examining half the voxels in many cases.

dwnsmpl  mips  sparse  MODE (MVx/sec)
2x2       1      N         1078.09
2x2       2      N         1030.90
2x2       1      Y          146.15
2x2       2      Y           69.25
2x2x2     1      N         1966.74
2x2x2     2      N         1790.60
2x2x2     1      Y         2041.96
2x2x2     2      Y         1758.42

Considerations: downsample_segmentation

The downsample_segmentation function performs mode pooling provided the downsample factor is a power of two, including in three dimensions. For non-power-of-two factors, striding is used. Mode pooling, which is usually what you want, is computed recursively. It is superior to striding, but the recursive calculation can introduce defects at mip levels above 1. This may be improved in the future.
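Mode pooling selects the most frequent label in each patch. A naive numpy sketch of the 2x2 case (illustrative only; tinybrain's Cython/C++ paths are much faster):

```python
import numpy as np

def mode_pool_2x2(labels):
    """Replace each 2x2 patch with its most frequent label."""
    x, y = labels.shape[0] // 2 * 2, labels.shape[1] // 2 * 2
    # Gather each 2x2 patch into a flat axis of 4 candidate labels.
    blocks = labels[:x, :y].reshape(x // 2, 2, y // 2, 2).transpose(0, 2, 1, 3)
    blocks = blocks.reshape(x // 2, y // 2, 4)
    out = np.empty((x // 2, y // 2), dtype=labels.dtype)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            vals, counts = np.unique(blocks[i, j], return_counts=True)
            out[i, j] = vals[np.argmax(counts)]  # the mode of the patch
    return out

seg = np.array([[1, 1], [1, 2]], dtype=np.uint64)
print(mode_pool_2x2(seg))  # [[1]] -- label 1 wins 3 votes to 1
```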

The way the calculation is actually done uses an ensemble of several different methods. For (2,2,1,1) and (2,2,2,1) downsamples, a fast, low-memory Cython path is selected: (2,2,1,1) implements countless_if, while (2,2,2,1) uses a combination of counting and "instant" majority detection. For (4,4,1) and other 2D powers of two, the countless 2d algorithm is used. For (4,4,4), (8,8,8), etc., the dynamic countless 3d algorithm is used. For 2D powers of two with the sparse flag enabled, stippled countless 2d is used. For all other configurations, striding is used.

Countless 2d paths are also fast, but use slightly more memory and time. Countless 3d is okay for (2,2,2) and (4,4,4), but its time and memory use grow exponentially with the product of the downsample dimensions. This state of affairs could be improved by implementing a counting-based algorithm in Cython/C++ for arbitrary factors that doesn't compute recursively. The countless algorithms were developed before I knew how to write Cython and package libraries. However, C++ implementations of countless are much faster than counting for computing the first 2x2x1 mip level; in particular, an AVX2 SIMD implementation can saturate memory bandwidth.
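The "instant" majority idea can be sketched as follows (a hypothetical Python rendering, not tinybrain's C code): while counting labels in a patch, stop as soon as one label holds a strict majority of all voxels, since no other label can overtake it.

```python
def instant_majority_mode(patch):
    """Mode of a small patch with early exit on a strict majority."""
    counts = {}
    for v in patch:
        counts[v] = counts.get(v, 0) + 1
        if 2 * counts[v] > len(patch):  # strict majority: remaining voxels can't matter
            return v
    return max(counts, key=counts.get)  # no majority: fall back to the full count

# Label 3 reaches 5 of 8 votes, so counting stops before the last three voxels.
print(instant_majority_mode([3, 3, 3, 3, 3, 1, 2, 4]))  # 3
```

For a homogeneous segmentation volume like the mouse cortex data above, this early exit is why the non-sparse 2x2x2 mode pooling runs so much faster.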

Documentation for the countless algorithm family is located here: https://github.com/william-silversmith/countless
