Bio image reading, metadata and some affine registration.

ndbioimage - Work in progress

Exposes (bio) images as a numpy ndarray-like object without loading the whole image into memory: data is read from the file only when needed. Selected metadata is read and stored in an OME structure (from the ome-types library). Additionally, it can automatically calculate an affine transform that corrects for chromatic aberrations and similar distortions, and apply it to the image on the fly.
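The lazy, ndarray-like behaviour builds on numpy's array protocol; the general pattern can be pictured with a minimal sketch (this illustrates the idea only and is not ndbioimage's actual implementation; `LazyFrames` and `_read_frame` are made-up names):

```python
import numpy as np

class LazyFrames:
    """Minimal sketch of a lazily read image stack: frames are only
    materialized when indexing or numpy asks for the data."""

    def __init__(self, shape):
        self.shape = shape  # (t, y, x)
        self.reads = 0      # counts how often data was actually read

    def _read_frame(self, t):
        # a real reader would seek into the file here
        self.reads += 1
        return np.full(self.shape[1:], float(t))

    def __getitem__(self, t):
        return self._read_frame(t)

    def __array__(self, dtype=None, copy=None):
        # numpy calls this when the whole object is converted to an ndarray
        data = np.stack([self._read_frame(t) for t in range(self.shape[0])])
        return data if dtype is None else data.astype(dtype)

stack = LazyFrames((3, 4, 4))
frame = stack[1]          # exactly one frame is read
whole = np.asarray(stack)  # only now is the full stack materialized
```

Until `np.asarray` is called, no bulk data exists in memory; each index access touches only the frame it needs.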

Currently, it supports ImageJ tif files, Zeiss czi files, Micro-Manager tif sequences, and anything Bio-Formats can handle.

Installation

pip install ndbioimage

Usage

  • Reading an image file and plotting the frame at channel=2, time=1
import matplotlib.pyplot as plt
from ndbioimage import Imread
with Imread('image_file.tif', axes='ctxy', dtype=int) as im:
    plt.imshow(im[2, 1])
  • Showing some image metadata
from ndbioimage import Imread
from pprint import pprint
with Imread('image_file.tif') as im:
    pprint(im)
  • Slicing the image without loading the image into memory
from ndbioimage import Imread
with Imread('image_file.tif', axes='cztxy') as im:
    sliced_im = im[1, :, :, 100:200, 100:200]

sliced_im is again an instance of Imread; image data is read from the file only when it is actually needed.

  • Converting (part) of the image to a numpy ndarray
from ndbioimage import Imread
import numpy as np
with Imread('image_file.tif', axes='cztxy') as im:
    array = np.asarray(im[0, 0])

Adding more formats

Readers for image formats subclass AbstractReader. When an image reader is imported, Imread will automatically recognize it and use it to open the appropriate file format. Image readers are required to implement the following methods:

  • staticmethod _can_open(path): return True if path can be opened by this reader
  • property ome: reads metadata from the file and adds it to an OME object imported from the ome-types library
  • __frame__(self, c, z, t): return the frame at channel=c, z-slice=z, time=t from the file

Optional methods:

  • open(self): open a file handle if needed
  • close(self): close any file handles

Optional fields:

  • priority (int): Imread tries readers with a lower number first; default: 99
  • do_not_pickle (strings): names of attributes that should not be included when the object is pickled, for example file handles
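Putting these pieces together, a reader for a made-up .myfmt format might look like the following sketch. AbstractReader normally comes from ndbioimage; a stub base class stands in here so the sketch is self-contained, and the file layout details are hypothetical:

```python
import numpy as np

class AbstractReader:  # stand-in for ndbioimage.AbstractReader
    pass

class MyFormatReader(AbstractReader):
    priority = 50                    # tried before the default-99 readers
    do_not_pickle = 'file_handle',   # file handles cannot be pickled

    @staticmethod
    def _can_open(path):
        # claim files with our (hypothetical) extension
        return str(path).endswith('.myfmt')

    def open(self):
        # self.path is set by ndbioimage before open is called
        self.file_handle = open(self.path, 'rb')

    def close(self):
        self.file_handle.close()

    @property
    def ome(self):
        # a real reader would populate and return an ome_types.OME
        # object with the metadata found in the file
        ...

    def __frame__(self, c, z, t):
        # return a 2D numpy array for channel c, z-slice z, time t;
        # a real reader would decode this frame from self.file_handle
        return np.zeros((512, 512))
```

Once such a reader is imported, Imread picks it for any path that `_can_open` accepts.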

TODO

  • more image formats
