# zimg

Python bindings for Atlas' image I/O + processing library (built with nanobind).
The main entry point is `ZImg`, a multidimensional image container designed for fast CPU array interop, region-of-interest (ROI) access, and common microscopy / scientific image formats.
## Highlights

- Multidimensional images: C, Z, Y, X, and T (time is represented as a list of arrays in Python).
- Read image metadata without loading full pixel data (`ZImg.readImgInfos`, `ZImg.readImgInfo`).
- Read full images or ROIs via `ZImgRegion` (end coordinate is exclusive).
- Fast CPU array interop via `ZImg.data` / `ZImg.to_arrays()` (NumPy / Torch / TensorFlow / JAX / Array API / memoryview).
- Zero-copy array wrapping when possible (CPU C-contiguous + `layout="CZYX"`), with `copy_if_needed` to enforce or relax this.
- Sub-block / tile access (`ZImg.readSubBlockLists`, `ZImg.readSubBlock`) for formats that support it.
- Streaming writers via Python-implemented providers (`ZImg.writeImg` + `ZImgSliceProvider` / `ZImgBlockProvider`).
- Save images (`ZImg.save`) with optional `ZImgWriteParameters` (compression, etc.).
- neuTube tracing / skeletonization workers exposed directly to Python: `ZNeutubeSkeletonize`, `ZNeutubeAutoTrace`, `ZNeutubeBlockedAutoTrace`, `ZSwcSubtract`, and `TraceConfig`.
- Embedded neuTube JSON presets available directly in Python via `zimg.neutube_json`.
## Installation

- Requires Python >= 3.12.
- Requires NumPy (installed automatically by `pip install zimg`).
- If a prebuilt wheel is available for your platform: `pip install zimg`.
- If `pip` falls back to building from source, see "Building from source" below.
## Quickstart

```python
import zimg

img = zimg.ZImg("example.ome.tif")
arr0 = img.to_arrays("numpy")[0]  # t = 0
print(arr0.shape)  # (C, Z, Y, X)
print(arr0.dtype)
img.save("out.tif")
```
## neuTube processing

The package exposes neuTube processing workers directly. They can be configured from Python with setter methods, or loaded/saved as task files via `loadTask(...)` / `saveTask(...)`.

Available classes:

- `ZNeutubeSkeletonize`: binary image to SWC skeletonization.
- `ZNeutubeAutoTrace`: whole-volume auto tracing on a selected channel / timepoint.
- `ZNeutubeBlockedAutoTrace`: blocked auto tracing for large datasets.
- `ZSwcSubtract`: subtract one or more SWCs from an input SWC.
- `TraceConfig`: algorithm-override struct for tracing score / behavior knobs.
### ZImgSource input model

These tracing workers accept `ZImgSource`, not just a plain filename. This matches Atlas' native image-source model and supports:

- single files
- file lists
- scene selection
- ROI selection
- explicit format hints

For simple single-file use, `setInputImagePath(...)` is still available.
```python
import zimg

source = zimg.ZImgSource("signal.ome.tif")
source.scene = 0
source.region = zimg.ZImgRegion((0, 0, 0, 0, 0), (-1, -1, -1, 1, 1))
```
### Embedded config presets

Current neuTube tracing / skeletonization presets are embedded directly into the Python package as `zimg.neutube_json`.

- Parsed JSON presets are exposed as Python `dict` values such as `zimg.neutube_json.SKELETONIZE`, `zimg.neutube_json.FLYEM_SKELETONIZE`, `zimg.neutube_json.TRACE_CONFIG`, and `zimg.neutube_json.TRACE_CONFIG_BIOCYTIN`.
- JSON text is also available through `get_preset_text(...)`; `write_preset(...)` can materialize a preset to disk when a file is needed.

The tracing/skeletonization workers accept either:

- a config file path, via `setSkeletonizeConfigPath(...)` or `setTraceConfigPath(...)`
- an inline Python `dict`, via `setSkeletonizeConfig(...)` or `setTraceConfig(...)`
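Because the presets are plain Python dicts, the file-path and inline variants are interchangeable. A minimal sketch of going from a dict to a file with the standard library (the preset contents below are placeholders, not the real `zimg.neutube_json.SKELETONIZE` values):

```python
import json
import tempfile
from pathlib import Path

# Placeholder dict standing in for a parsed preset such as
# zimg.neutube_json.SKELETONIZE.
preset = {"downsampleInterval": -1, "minimalLength": 40.0}

# Materialize the dict to a JSON file, similar in spirit to what
# zimg.neutube_json.write_preset(...) does.
config_path = Path(tempfile.mkdtemp()) / "skeletonize.json"
config_path.write_text(json.dumps(preset, indent=2))

# Either form can now be handed to a worker:
#   proc.setSkeletonizeConfigPath(str(config_path))  # file path
#   proc.setSkeletonizeConfig(preset)                # inline dict
roundtripped = json.loads(config_path.read_text())
print(roundtripped == preset)  # True
```

The inline-dict form is usually the more convenient one in scripts; the file form is handy when a config needs to be versioned or shared between runs.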
### Skeletonize a binary image to SWC

```python
import zimg

proc = zimg.ZNeutubeSkeletonize()
proc.setInputImageSource(zimg.ZImgSource("binary_mask.tif"))
proc.setSkeletonizeConfig(zimg.neutube_json.SKELETONIZE)
proc.setOutputSwcPath("binary_mask.swc")
proc.run()
print(proc.hasResult(), proc.outputSwcPath())
```
### Auto trace a signal volume

```python
import zimg

proc = zimg.ZNeutubeAutoTrace()
proc.setInputImageSource(zimg.ZImgSource("signal.ome.tif"))
proc.setSelectedChannelTime(0, 0)
proc.setTraceConfig(zimg.neutube_json.TRACE_CONFIG)
proc.setOutputSwcPath("autotrace.swc")

# Optional: override selected tracing parameters only when you already
# know which fields you want to change for your dataset.
#
# overrides = zimg.TraceConfig()
# overrides.minAutoScore = ...
# overrides.seedMethod = ...
# proc.setAlgoConfigOverrides(overrides)

proc.run()
```
### Blocked auto trace for large datasets

`ZNeutubeBlockedAutoTrace` is intended for larger datasets where tracing should run block-by-block instead of materializing the whole volume at once. When possible, it derives metadata such as dataset shape and z-scale directly from the provided `ZImgSource`.

It also exposes advanced tiling controls such as block core size and halo, but those values are workload-specific. If you do not already have tuned settings for a dataset, start with the worker defaults instead of guessing block sizes in Python.
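To make the core/halo terminology concrete, here is an illustrative sketch of how a blocked pass over one axis is typically tiled: each block produces results only for its core region, but reads a halo of extra voxels around the core so traces near block edges see enough context. This is a generic tiling pattern, not the worker's actual implementation; `tile_1d` and its parameters are hypothetical.

```python
def tile_1d(extent, core, halo):
    """Yield (read_start, read_stop, core_start, core_stop) tiles
    covering [0, extent) with the given core size and halo overlap."""
    for core_start in range(0, extent, core):
        core_stop = min(core_start + core, extent)
        # The read window extends past the core by `halo` on each side,
        # clamped to the dataset bounds.
        read_start = max(core_start - halo, 0)
        read_stop = min(core_stop + halo, extent)
        yield read_start, read_stop, core_start, core_stop

# A 100-voxel axis, 40-voxel cores, 8-voxel halo.
tiles = list(tile_1d(100, core=40, halo=8))
print(tiles)
# [(0, 48, 0, 40), (32, 88, 40, 80), (72, 100, 80, 100)]
```

The trade-off is the usual one: a larger halo means more redundant reads per block but fewer artifacts at block seams.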
```python
import zimg

proc = zimg.ZNeutubeBlockedAutoTrace()
proc.setInputImageSource(zimg.ZImgSource("large_signal.ome.zarr"))
proc.setSelectedChannelTime(0, 0)
proc.setSignalDownsampleRatio([2, 2, 1])
proc.setTraceConfig(zimg.neutube_json.TRACE_CONFIG)
proc.setOutputSwcPath("blocked_autotrace.swc")
proc.setOutputSessionDir("blocked_autotrace_session")
proc.run()
```
### Subtract SWCs

```python
import zimg

proc = zimg.ZSwcSubtract()
proc.setInputSwcFilename("full_tree.swc")
proc.setSubtractSwcFilenames(["artifact_1.swc", "artifact_2.swc"])
proc.setOutputSwcFilename("cleaned_tree.swc")
proc.run()
```
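For context on what these workers read and write: SWC is a plain-text tree format where each non-comment line is `id type x y z radius parent`, with `parent == -1` marking a root. A minimal reader sketch based on those standard SWC conventions (this is not a zimg API):

```python
import numpy as np
from io import StringIO

# Two-node toy tree in standard SWC columns:
# id type x y z radius parent   (parent -1 = root)
swc_text = """# toy tree
1 2 0.0 0.0 0.0 1.0 -1
2 2 5.0 0.0 0.0 0.8 1
"""

# np.loadtxt skips '#' comment lines by default.
nodes = np.loadtxt(StringIO(swc_text))
roots = nodes[nodes[:, 6] == -1]
print(nodes.shape, len(roots))  # (2, 7) 1
```

Subtracting SWCs operates on trees in this format, so the inputs and output of `ZSwcSubtract` are all ordinary SWC files you can inspect with any text editor.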
## Reading a region (ROI)

`ZImgRegion` is defined by `(x, y, z, c, t)` start/end coordinates, where the end is exclusive. Use `-1` for any end component to mean "to the end" of that dimension.
```python
import zimg

region = zimg.ZImgRegion((0, 0, 0, 0, 0), (256, 256, 64, -1, 1))
img = zimg.ZImg("big.ome.tif", region=region)
arr0 = img.to_arrays("numpy")[0]
```
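The start/end convention maps directly onto Python slicing. A pure-NumPy sketch of the semantics (to illustrate the exclusive end and the `-1` sentinel, not to call the actual binding; `region_to_slices` is a hypothetical helper):

```python
import numpy as np

def region_to_slices(start, end, shape):
    """Convert per-axis (start, end) pairs into slices, where end is
    exclusive and -1 means 'to the end of that dimension'."""
    return tuple(
        slice(s, dim if e == -1 else e)
        for s, e, dim in zip(start, end, shape)
    )

vol = np.arange(4 * 6 * 8).reshape(4, 6, 8)  # e.g. (Z, Y, X)
roi = vol[region_to_slices((1, 0, 2), (3, -1, 6), vol.shape)]
print(roi.shape)  # (2, 6, 4)
```

The exclusive end means `end - start` is exactly the extent of the ROI along each axis, which keeps shape arithmetic simple.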
## Creating from arrays

The canonical layout is CZYX. Pass `layout=` if your arrays use a different dimension order. For CPU arrays, `ZImg` will wrap zero-copy when possible:

- If `layout="CZYX"` and the input is CPU C-contiguous: typically zero-copy.
- Otherwise: it will copy unless `copy_if_needed=False` (then it raises).
```python
import numpy as np
import zimg

arr = np.zeros((1, 1, 64, 64), dtype=np.uint16)  # C, Z, Y, X
img = zimg.ZImg(arr, layout="CZYX", copy_if_needed=False)  # enforce zero-copy
img.save("zeros.tif")
```
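Whether the zero-copy path is available can be checked ahead of time with NumPy alone: a view produced by transposing to a non-CZYX order shares memory but is no longer C-contiguous, and `np.ascontiguousarray` makes the copy explicit when you want to restore contiguity yourself. A small sketch of that check:

```python
import numpy as np

arr = np.zeros((2, 3, 64, 64), dtype=np.uint16)  # C, Z, Y, X
print(arr.flags["C_CONTIGUOUS"])                 # True -> zero-copy eligible

# A transposed view (here Z, C, Y, X) shares memory but is not
# C-contiguous, so wrapping it zero-copy as CZYX is not possible.
zcyx = arr.transpose(1, 0, 2, 3)
print(zcyx.flags["C_CONTIGUOUS"])                # False

# An explicit copy restores contiguity (at the cost of the copy).
fixed = np.ascontiguousarray(zcyx)
print(fixed.flags["C_CONTIGUOUS"])               # True
```

This is why `copy_if_needed=False` is useful as an assertion: it turns a silent copy into an error you can fix at the source.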
`ZImg.to_arrays(framework="auto")` will return CPU arrays in the requested framework. With `framework="auto"`, it mirrors the input framework if the image was created zero-copy from arrays; otherwise it returns NumPy arrays.
## Sub-block / tiled IO

Some formats store images as sub-blocks / tiles (e.g. pyramid levels, chunked layouts). zimg exposes these via:

- `ZImg.readSubBlockLists(...)` → per-scene list of NumPy int64 arrays describing sub-blocks. Each sub-block record contains `(t, x, y, z, width, height, depth, xRatio, yRatio, zRatio)`.
- `ZImg.readSubBlock(...)` → read an individual sub-block by `(scene, blockIndex)`.
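Since each record is a flat int64 row, one convenient pattern is to label the columns yourself. A sketch using fabricated records (only the 10-field layout comes from the API description above; the values are made up):

```python
import numpy as np

FIELDS = ("t", "x", "y", "z", "width", "height", "depth",
          "xRatio", "yRatio", "zRatio")

# Fabricated sub-block records in the documented 10-field layout.
blocks = np.array([
    [0, 0,   0, 0, 256, 256, 32, 1, 1, 1],
    [0, 256, 0, 0, 256, 256, 32, 1, 1, 1],
], dtype=np.int64)

# Index columns by name for readability.
col = {name: i for i, name in enumerate(FIELDS)}
widths = blocks[:, col["width"]]
x_offsets = blocks[:, col["x"]]
print(widths.tolist(), x_offsets.tolist())  # [256, 256] [0, 256]
```

The ratio fields describe downsampling relative to the full-resolution grid, which is how pyramid levels are distinguished from full-resolution tiles.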
## Streaming writes (slice/block providers)

`ZImg.writeImg(...)` can write from a provider instead of requiring a full in-memory `ZImg`. Implement the provider interface in Python:

- `ZImgSliceProvider`: implement `imgInfo()`, `slice(z, t)`, `allSlices(t)`, `wholeImg()`.
- `ZImgBlockProvider`: implement `imgInfo()`, `numBlocks()`, `blockCoord(i)`, `block(i)`, `wholeImg()`.

This is useful for very large datasets or pipelines that generate data incrementally.
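As a shape reference only, here is what a slice provider might look like. Whether the class must subclass `zimg.ZImgSliceProvider` or can simply implement the listed methods is binding-specific, so this is written as a plain class with synthetic NumPy data, and the `imgInfo()` return value is a stand-in dict rather than a real `ZImgInfo`:

```python
import numpy as np

class RampSliceProvider:
    """Illustrative slice provider: yields synthetic Z-slices on demand
    instead of holding the whole volume in memory."""

    def __init__(self, depth, height, width):
        self.depth, self.height, self.width = depth, height, width

    def imgInfo(self):
        # The real binding returns image metadata; a dict stands in here.
        return {"shape_czyx": (1, self.depth, self.height, self.width)}

    def slice(self, z, t):
        # Generate slice z lazily (here: a constant value per slice).
        return np.full((self.height, self.width), z, dtype=np.uint16)

    def allSlices(self, t):
        return [self.slice(z, t) for z in range(self.depth)]

    def wholeImg(self):
        return np.stack(self.allSlices(0))[np.newaxis]  # C, Z, Y, X

provider = RampSliceProvider(depth=4, height=8, width=8)
print(provider.wholeImg().shape)  # (1, 4, 8, 8)
```

The point of the provider model is that the writer only ever asks for one slice (or block) at a time, so peak memory stays at one slice even for very large outputs.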
## Compression parameters

`ZImgWriteParameters` exposes common compression knobs (availability depends on the file format):

- `compression` (see `zimg.Compression`)
- `zlibCompressionLevel`
- `jpegQuality`, `jpegProgressive`, `jpegChrominanceSubsampling`, `jpegAccurateDCT`
- `jpegXRQuality`
## File formats

Supported file formats depend on how the wheel/source was built. The `zimg.FileFormat` enum includes:

- `Tiff`, `OmeTiff`, `Png`, `Jpeg`, `JpegXR`
- `ZeissCZI`, `ZeissLsm`, `Leica`
- `Vaa3DRaw`, `HDF5Img`, `MetaImage`, `ITKImage`
- `FreeImage` (optional; omitted when built with `-DZIMG_DISABLE_FREEIMAGE=ON`)
## Notes / limitations

- `ZImg.to_arrays()` and `ZImg.data` expose CPU-backed arrays that reference the `ZImg` buffers (the Python arrays keep the parent `ZImg` alive).
- GPU arrays are not supported.
## Building from source

This package reuses Atlas' native CMake build. Building from source generally requires the same native dependencies as Atlas (a compiler toolchain, Qt, and the Atlas third-party libraries built and configured).

From the repo root:

```shell
cd python/zimg
python -m pip install .
```

If you build Atlas via conda recipes, ensure the expected third-party artifacts are present (e.g. `src/3rdparty/build/`) before building this wheel.