pyramid_sampler

simple tools for creating image pyramids

A small utility for taking a 3D zarr image at a single resolution and downsampling to create an image pyramid.

installation

python -m pip install pyramid-sampler

usage

create a test base image

import zarr
from pyramid_sampler import initialize_test_image

# create an on-disk zarr store
zarr_file = "test_image.zarr"
zarr_store = zarr.group(zarr_file)

# write base level 0 image to the specified store and field name
new_field = "field1"
base_res = (1024, 1024, 1024)
chunks = (64, 64, 64)
initialize_test_image(zarr_store, new_field, base_res, chunks=chunks)

initialize_test_image uses a dask.delayed workflow, so you can configure a dask client before calling initialize_test_image if you wish. The resulting base image will reside in zarr_store[new_field][0].

downsampling an image

First, initialize a Downsampler instance with the path to the zarr store, the refinement factor to use between image levels, the base image resolution, and the chunk sizes:

from pyramid_sampler import Downsampler

zarr_file = "test_image.zarr"
refinement_factor = (2, 2, 2)
base_res = (1024, 1024, 1024)
chunks = (64, 64, 64)
ds = Downsampler(zarr_file, refinement_factor, base_res, chunks)

For now, this assumes your base image resides in zarr_file/field_name/0. To run the downsampling:

field_to_downsample = "field1"
max_levels = 10
ds.downsample(max_levels, field_to_downsample)

Downsampling proceeds only until a level is created that fits in a single chunk of the size set by the Downsampler and the image chunksize, i.e., until base_resolution / refinement_factor**current_level / chunks reaches 1 in any dimension. Equivalently, max_levels = log(base_resolution / chunks) / log(refinement_factor), giving a maximum level id of max_levels - 1 to account for 0-indexing.
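With the example numbers above, that stopping point can be computed directly (a quick sketch of the formula; base-2 logs are used here only because the example values are powers of two):

```python
import math

# values from the usage example above
base_resolution = 1024
chunk_size = 64
refinement_factor = 2

# downsampling stops once base_resolution / refinement_factor**level / chunk_size
# reaches 1, so max_levels = log(base_resolution / chunk_size) / log(refinement_factor)
max_levels = round(math.log2(base_resolution // chunk_size) / math.log2(refinement_factor))
print(max_levels)  # 4
```

So even though the usage example requests max_levels = 10, downsampling with these values stops after 4 levels, i.e., at a maximum level id of 3.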

assumptions

Some assumptions in the current algorithm:

  • exact chunks only! The base image resolution and chunks must subdivide evenly, and each downsampling step must result in a whole number of chunks.
  • the base field exists and is stored at zarr_store[field][0]
  • Only tested with on-disk filestores, but should work for any zarr store.
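The exact-chunks assumption can be verified up front; a minimal sketch (check_exact_chunks is an illustrative helper, not part of the package API):

```python
def check_exact_chunks(base_res, chunks, refinement_factor):
    """Raise if the resolution, chunks, and refinement factor do not subdivide evenly."""
    for dim, (r, c, f) in enumerate(zip(base_res, chunks, refinement_factor)):
        if r % c != 0:
            raise ValueError(f"dim {dim}: resolution {r} is not a multiple of chunk size {c}")
        if (r // c) % f != 0:
            raise ValueError(f"dim {dim}: chunk count {r // c} does not divide by factor {f}")

# the example values subdivide cleanly, so this passes silently
check_exact_chunks((1024, 1024, 1024), (64, 64, 64), (2, 2, 2))
```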

method

At present, the downsampling simply averages overlapping array elements: for a given image level L, the pixels of the higher-resolution level L - 1 covered by each pixel in L are found and averaged. Levels are built up sequentially (i.e., level L is built from level L - 1, not from the base resolution).
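That averaging step can be sketched in plain NumPy (the package itself runs a compiled kernel over dask chunks; this reshape-and-mean version only illustrates the reduction, assuming exact division by the refinement factor):

```python
import numpy as np

def downsample_by_averaging(arr, factors=(2, 2, 2)):
    # collapse each non-overlapping factors-sized block of level L - 1
    # into a single pixel of level L by taking its mean
    f0, f1, f2 = factors
    s0, s1, s2 = arr.shape
    return arr.reshape(s0 // f0, f0, s1 // f1, f1, s2 // f2, f2).mean(axis=(1, 3, 5))

level_minus_1 = np.arange(64, dtype=np.float64).reshape(4, 4, 4)
level = downsample_by_averaging(level_minus_1)
print(level.shape)     # (2, 2, 2)
print(level[0, 0, 0])  # 10.5, the mean of the first 2x2x2 block
```

Because each output pixel is a plain mean, the global mean of the image is preserved across levels.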

Calculations and chunk-processing are accelerated with dask delayed objects using numba.jit compilation for the pixel-averaging.

developing & contributing

At present, this package is a small utility used for experimentation with zarr files. But contributions are welcome! Open up an issue at https://github.com/data-exp-lab/pyramid_sampler/issues to discuss ideas.

cutting a new release

Notes for maintainers on cutting a new release:

  1. create and push a new tag:
git tag v0.1.0
git push upstream v0.1.0
  2. create a new release on github via the release interface, using the tag you just pushed.
  3. on publishing the new github release, the github action .github/workflows/cd.yml runs. This action builds the distribution and pushes it to pypi.

Note that the publication to pypi in step 3 uses a Trusted Publisher configured by @chris.havlin.
