

Project description

auxiliary


Auxiliary is a Python package providing utility functions for medical image processing. It is part of the BrainLesion project and offers tools for:

  • Image I/O: Reading and writing medical images (NIfTI, TIFF, DICOM) using SimpleITK
  • Image Normalization: Percentile-based and windowing normalization methods
  • Format Conversion: DICOM to NIfTI and NIfTI to DICOM conversion
  • Path Utilities: Robust path handling with the turbopath module

Installation

With a Python 3.10+ environment, you can install auxiliary directly from PyPI:

pip install auxiliary

Or via conda:

conda install conda-forge::auxiliary

Optional Dependencies

For DICOM to NIfTI conversion using dcm2niix:

pip install auxiliary[dcm2niix]

Usage

NIfTI I/O

from auxiliary.io import read_image, write_image

# Read a NIfTI image
image_array = read_image("path/to/image.nii.gz")

# Write a NumPy array to a NIfTI file
write_image(image_array, "path/to/output.nii.gz")

# Write with reference image for spatial metadata
write_image(image_array, "path/to/output.nii.gz", reference_path="path/to/reference.nii.gz")

DICOM I/O

from auxiliary.conversion import dicom_to_nifti_itk, nifti_to_dicom_itk, dcm2niix
import numpy as np

# Read a DICOM series and convert to NIfTI using SimpleITK
dicom_to_nifti_itk("path/to/dicom_dir", "path/to/output_dir")

# Read a DICOM series and convert to NIfTI using dcm2niix (requires dcm2niix extra)
dcm2niix("path/to/dicom_dir", "path/to/output_dir")

# Write a NIfTI image to DICOM format
nifti_to_dicom_itk("path/to/image.nii.gz", "path/to/output_dicom_dir")

# Write a NumPy array to DICOM format
image_array = np.random.rand(128, 128, 64)  # example 3D array
nifti_to_dicom_itk(image_array, "path/to/output_dicom_dir")

# Write a NumPy array to DICOM with reference DICOM for metadata
nifti_to_dicom_itk(
    image_array,
    "path/to/output_dicom_dir",
    reference_dicom="path/to/reference_dicom_dir"
)

TIFF I/O

from auxiliary.tiff.io import read_tiff, write_tiff

# Read a TIFF file
tiff_data = read_tiff("path/to/image.tiff")

# Write a NumPy array to a TIFF file
write_tiff(tiff_data, "path/to/output.tiff")

Image Normalization

from auxiliary.normalization.percentile_normalizer import PercentileNormalizer
from auxiliary.normalization.windowing_normalizer import WindowingNormalizer

# Percentile-based normalization
normalizer = PercentileNormalizer(lower_percentile=1.0, upper_percentile=99.0)
normalized_image = normalizer.normalize(image_array)

# Windowing normalization (e.g., for CT images)
normalizer = WindowingNormalizer(center=40, width=400)
windowed_image = normalizer.normalize(image_array)
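
Conceptually, windowing clips intensities to the range [center - width/2, center + width/2] (here -160 to 240 HU, a typical soft-tissue window) and rescales that range linearly. The snippet below is only a plain-NumPy sketch of that idea; the output range and dtype handling are assumptions, not the package's actual implementation.

import numpy as np

def windowing_sketch(image, center=40.0, width=400.0, out_min=0.0, out_max=1.0):
    # Window bounds: [-160, 240] for the CT example above
    low, high = center - width / 2.0, center + width / 2.0
    clipped = np.clip(image, low, high)
    # Map [low, high] linearly onto [out_min, out_max]
    return (clipped - low) / (high - low) * (out_max - out_min) + out_min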

Citation

Important: If you use auxiliary in your research, please cite it to support the development!

Kofler, F., Rosier, M., Astaraki, M., Möller, H., Mekki, I. I., Buchner, J. A., Schmick, A., Pfiffer, A., Oswald, E., Zimmer, L., Rosa, E. de la, Pati, S., Canisius, J., Piffer, A., Baid, U., Valizadeh, M., Linardos, A., Peeken, J. C., Shit, S., … Menze, B. (2025). BrainLesion Suite: A Flexible and User-Friendly Framework for Modular Brain Lesion Image Analysis. arXiv preprint arXiv:2507.09036.

@misc{kofler2025brainlesionsuiteflexibleuserfriendly,
      title={BrainLesion Suite: A Flexible and User-Friendly Framework for Modular Brain Lesion Image Analysis}, 
      author={Florian Kofler and Marcel Rosier and Mehdi Astaraki and Hendrik Möller and Ilhem Isra Mekki and Josef A. Buchner and Anton Schmick and Arianna Pfiffer and Eva Oswald and Lucas Zimmer and Ezequiel de la Rosa and Sarthak Pati and Julian Canisius and Arianna Piffer and Ujjwal Baid and Mahyar Valizadeh and Akis Linardos and Jan C. Peeken and Suprosanna Shit and Felix Steinbauer and Daniel Rueckert and Rolf Heckemann and Spyridon Bakas and Jan Kirschke and Constantin von See and Ivan Ezhov and Marie Piraud and Benedikt Wiestler and Bjoern Menze},
      year={2025},
      eprint={2507.09036},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2507.09036}, 
}


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

auxiliary-0.4.2.tar.gz (14.9 kB)

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

auxiliary-0.4.2-py3-none-any.whl (18.3 kB)

File details

Details for the file auxiliary-0.4.2.tar.gz.

File metadata

  • Download URL: auxiliary-0.4.2.tar.gz
  • Size: 14.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.0

File hashes

Hashes for auxiliary-0.4.2.tar.gz

  • SHA256: 8b88da00628ceaaa28acb452cd6ddcb3638763d8e6bfd8c460be316957457335
  • MD5: e29733c8079cbb01625c38936166452c
  • BLAKE2b-256: 8483318b728366ff7a4b7c27f2dd3bdb0fd43f18779af107add6ae1b1a7ef1d9

See the PyPI documentation for more details on using hashes.
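
As a quick sanity check, you can compare a downloaded file against the SHA256 digest listed above. The sketch below uses only the Python standard library; the local file path is an assumption about where the sdist was saved.

import hashlib

# Expected SHA256 for auxiliary-0.4.2.tar.gz (from the table above)
expected = "8b88da00628ceaaa28acb452cd6ddcb3638763d8e6bfd8c460be316957457335"

sha256 = hashlib.sha256()
with open("auxiliary-0.4.2.tar.gz", "rb") as f:  # assumed local download path
    for chunk in iter(lambda: f.read(1 << 16), b""):
        sha256.update(chunk)

print("match:", sha256.hexdigest() == expected)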

File details

Details for the file auxiliary-0.4.2-py3-none-any.whl.

File metadata

  • Download URL: auxiliary-0.4.2-py3-none-any.whl
  • Size: 18.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.0

File hashes

Hashes for auxiliary-0.4.2-py3-none-any.whl

  • SHA256: 6edafe6d9c5189b5ee72fc83b707afb1c801c36470b4de54f161e808e6afcda8
  • MD5: ec432086e4569043976c25c9a61b8f3e
  • BLAKE2b-256: eadacdb81af31783f3f73143b543e9665365b291b99fdc1da7eecbeaecd964b4

See the PyPI documentation for more details on using hashes.
