simple tools for creating image pyramids

Project description

pyramid_sampler

A small utility for taking a 3D zarr image at a single resolution and downsampling to create an image pyramid.

installation

python -m pip install pyramid-sampler

usage

create a test base image

import zarr
from pyramid_sampler import initialize_test_image

# create an on-disk zarr store
zarr_file = "test_image.zarr"
zarr_store = zarr.group(zarr_file)

# write base level 0 image to the specified store and field name
new_field = "field1"
base_res = (1024, 1024, 1024)
chunks = (64, 64, 64)
initialize_test_image(zarr_store, new_field, base_res, chunks=chunks)

initialize_test_image uses a dask.delayed workflow, so you can configure a dask client before calling initialize_test_image if you wish. The resulting base image will reside in zarr_store[new_field][0].

downsampling an image

First, initialize a Downsampler instance with the path to the zarr store, the refinement factor to use between image levels, the base image resolution, and the chunk sizes:

from pyramid_sampler import Downsampler

zarr_file = "test_image.zarr"
refinement_factor = (2, 2, 2)
base_res = (1024, 1024, 1024)
chunks = (64, 64, 64)
ds = Downsampler(zarr_file, refinement_factor, base_res, chunks)

For now, this assumes your base image will reside in zarr_file/field_name/0. To run the downsampling,

field_to_downsample = "field1"
max_levels = 10
ds.downsample(max_levels, field_to_downsample)

Downsampling will only proceed until a level is created that consists of a single chunk, as determined by the Downsampler's refinement factor and image chunksize: that is, until base_resolution / refinement_factor**current_level / chunks reaches a value of 1 in any dimension. Equivalently, max_levels = log(base_resolution / chunks) / log(refinement_factor), giving a maximum level id of max_levels - 1 to account for 0-indexing.
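The stopping rule above can be sketched in a few lines of Python (max_levels here is a hypothetical helper, not part of the package's API; taking the minimum over dimensions for non-uniform inputs is an assumption):

```python
import math

def max_levels(base_res, chunks, refinement_factor):
    # Number of levels from the formula above:
    # max_levels = log(base_resolution / chunks) / log(refinement_factor),
    # evaluated per dimension; the smallest dimension stops the pyramid first.
    return min(
        int(round(math.log(r / c) / math.log(f)))
        for r, c, f in zip(base_res, chunks, refinement_factor)
    )

n = max_levels((1024, 1024, 1024), (64, 64, 64), (2, 2, 2))
print(n)  # 4, so level ids run from 0 to 3
```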

assumptions

Some assumptions in the current algorithm:

  • exact chunks only! The base image resolution and chunks must subdivide perfectly, and each downsampled level must contain a whole number of chunks.
  • the base field exists and is stored at zarr_store[field][0]
  • Only tested with on-disk filestores, but should work for any zarr store.
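The first assumption can be checked up front; a minimal sketch (check_exact_chunks is a hypothetical helper, not a function the package provides, and it only checks one round of downsampling):

```python
def check_exact_chunks(base_res, chunks, refinement_factor):
    """Check that each dimension subdivides into whole chunks, both for the
    base image and after one round of downsampling by refinement_factor."""
    for r, c, f in zip(base_res, chunks, refinement_factor):
        if r % c != 0:         # base image must hold an integer number of chunks
            return False
        if (r // f) % c != 0:  # level 1 must also hold whole chunks
            return False
    return True

print(check_exact_chunks((1024, 1024, 1024), (64, 64, 64), (2, 2, 2)))  # True
print(check_exact_chunks((1000, 1000, 1000), (64, 64, 64), (2, 2, 2)))  # False
```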

method

At present, the downsampling simply averages overlapping array elements: for a given image level L1, the pixels of the higher-resolution level L1 - 1 covered by each pixel in L1 are found and averaged. Levels are built up sequentially (i.e., L1 is built from L1 - 1, not from the base resolution).

Calculations and chunk-processing are accelerated with dask delayed objects, with numba.jit compilation used for the pixel averaging.
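As an illustration of the averaging step, here is a pure-Python 2D sketch (not the package's numba-accelerated 3D implementation): each non-overlapping refinement-factor block of the finer level is averaged into one pixel of the coarser level.

```python
def downsample_2d(image, refinement=(2, 2)):
    """Average non-overlapping refinement blocks of a nested-list image."""
    ny, nx = len(image), len(image[0])
    ry, rx = refinement
    coarse = []
    for j in range(0, ny, ry):
        row = []
        for i in range(0, nx, rx):
            # gather the ry x rx block of finer-level pixels and average them
            block = [image[j + dj][i + di] for dj in range(ry) for di in range(rx)]
            row.append(sum(block) / len(block))
        coarse.append(row)
    return coarse

level0 = [[1, 1, 3, 3],
          [1, 1, 3, 3],
          [5, 5, 7, 7],
          [5, 5, 7, 7]]
level1 = downsample_2d(level0)  # → [[1.0, 3.0], [5.0, 7.0]]
```

The 3D case is the same idea with one more loop over the third axis; building each level from the previous one (rather than from level 0) keeps every averaging pass cheap.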

developing & contributing

At present, this package is a small utility used for experimentation with zarr files, but contributions are welcome! Open an issue at https://github.com/data-exp-lab/pyramid_sampler/issues to discuss ideas.

cutting a new release

Notes for maintainers on cutting a new release:

  1. create and push a new tag:

git tag v0.1.0
git push upstream v0.1.0

  2. create a new release on github via the release interface, using the tag you just pushed.
  3. on publishing the new github release, a github action (.github/workflows/cd.yml) runs. This action builds the distribution and pushes it to PyPI.

Note that the publication to PyPI in step 3 uses a Trusted Publisher configured by @chris.havlin.

Download files


Source Distribution

pyramid_sampler-0.1.0.tar.gz (494.8 kB)

Uploaded Source

Built Distribution


pyramid_sampler-0.1.0-py3-none-any.whl (9.5 kB)

Uploaded Python 3

File details

Details for the file pyramid_sampler-0.1.0.tar.gz.

File metadata

  • Download URL: pyramid_sampler-0.1.0.tar.gz
  • Upload date:
  • Size: 494.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for pyramid_sampler-0.1.0.tar.gz:

  • SHA256: 0acdd3900e11d4f8589763aa837df1b08e6a63e03d679d5df18daa656ea2a430
  • MD5: 601b718eadf3251b0b8976cada70e853
  • BLAKE2b-256: 118903af13338e1358686c2293b64d273c749ed4dc852fdc30f100eea5180c11


Provenance

The following attestation bundles were made for pyramid_sampler-0.1.0.tar.gz:

Publisher: cd.yml on data-exp-lab/pyramid_sampler

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file pyramid_sampler-0.1.0-py3-none-any.whl.

File metadata

File hashes

Hashes for pyramid_sampler-0.1.0-py3-none-any.whl:

  • SHA256: 0fe3fce11c7839c17049d48cd533e92603683cd03a5538fecd8cca92ecd6bbf4
  • MD5: e781e9d2acf1a2837bdce29aac02ffc8
  • BLAKE2b-256: 020708bfe0da9ea65944022f8785d9164a60d3b35e20b9cbcdebf0fb6cc1b8cd


Provenance

The following attestation bundles were made for pyramid_sampler-0.1.0-py3-none-any.whl:

Publisher: cd.yml on data-exp-lab/pyramid_sampler

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
