Bio-image reading, metadata, and some affine registration.

ndbioimage - Work in progress

Exposes (bio) images as a numpy ndarray-like object without loading the whole image into memory; data are read from the file only when needed. Some metadata is read and stored in an OME structure. Additionally, it can automatically calculate an affine transform that corrects for chromatic aberrations etc. and apply it to the image on the fly.

Currently, it supports ImageJ TIFF files, Zeiss CZI files, Micro-Manager TIFF sequences, and anything Bio-Formats can handle.

Installation

pip install ndbioimage

Usage

  • Reading an image file and plotting the frame at channel=2, time=1
import matplotlib.pyplot as plt
from ndbioimage import Imread
with Imread('image_file.tif', axes='ctxy', dtype=int) as im:
    plt.imshow(im[2, 1])
    plt.show()
  • Showing some image metadata
from ndbioimage import Imread
from pprint import pprint
with Imread('image_file.tif') as im:
    pprint(im)
  • Slicing the image without loading the image into memory
from ndbioimage import Imread
with Imread('image_file.tif', axes='cztxy') as im:
    sliced_im = im[1, :, :, 100:200, 100:200]

sliced_im is an instance of Imread which will load image data from the file only when needed.

  • Converting (part) of the image to a numpy ndarray
from ndbioimage import Imread
import numpy as np
with Imread('image_file.tif', axes='cztxy') as im:
    array = np.asarray(im[0, 0])

Adding more formats

Readers for image formats subclass AbstractReader. When an image reader is imported, Imread automatically recognizes it and uses it to open the appropriate file format. Image readers are required to implement the following:

  • staticmethod _can_open(path): return True if path can be opened by this reader
  • property ome: reads metadata from the file and stores it in an OME object from the ome-types library
  • __frame__(self, c, z, t): return the frame at channel=c, z-slice=z, time=t from the file

Optional methods:

  • open(self): maybe open some file handle
  • close(self): close any file handles

Optional fields:

  • priority (int): Imread will try readers with a lower number first, default: 99
  • do_not_pickle (strings): any attributes that should not be included when the object is pickled, for example: any file handles
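The interface above can be illustrated with a stand-alone toy class. This is only a sketch: everything besides the required method names (_can_open, open, close, __frame__, priority, do_not_pickle) is hypothetical, a real reader would subclass ndbioimage.AbstractReader and build an ome-types OME object instead of fabricating data.

```python
import numpy as np

# Toy reader modeled on the interface described above. A real reader
# would subclass ndbioimage.AbstractReader; this plain class keeps the
# example self-contained by generating data instead of reading a file.
class ToyReader:
    priority = 50            # tried before readers with the default of 99
    do_not_pickle = ('fh',)  # file handles cannot be pickled

    @staticmethod
    def _can_open(path):
        # claim only files with a made-up .toy extension
        return str(path).endswith('.toy')

    def open(self):
        # a real reader would open a file handle here
        self.fh = None
        self.data = np.zeros((2, 3, 4, 16, 16))  # axes: c, z, t, y, x

    def close(self):
        # release any file handles opened in open()
        self.fh = None

    def __frame__(self, c, z, t):
        # return the 2D frame at channel=c, z-slice=z, time=t
        return self.data[c, z, t]

reader = ToyReader()
reader.open()
frame = reader.__frame__(1, 0, 2)
print(frame.shape)  # (16, 16)
reader.close()
```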

TODO

  • more image formats
