Longitudinal HR-pQCT timelapse analysis workflow with multistack support
TimelapsedHRpQCT v2
Quantifying remodelling activity from time-lapsed HR-pQCT images of the distal radius or tibia.
This repository is the v2 codebase. The original v1 repository is here:
https://github.com/wallematthias/TimelapsedHRpQCTv1/tree/main
Changes from v1:
- Added functionality for multistack images
- Elastix backend for registration
- More detailed remodelling outputs
Citation
If you use this tool in a publication, please cite:
- Walle M, Whittier DE, Schenk D, Atkins PR, Blauth M, Zysset P, Lippuner K, Muller R, Collins CJ. Precision of bone mechanoregulation assessment in humans using longitudinal high-resolution peripheral quantitative computed tomography in vivo. Bone. 2023;172:116780.
For related methodology, cite:
- Whittier DE, Walle M, Schenk D, Atkins PR, Collins CJ, Zysset P, Lippuner K, Muller R. A multi-stack registration technique to improve measurement accuracy and precision across longitudinal HR-pQCT scans. Bone. 2023;176:116893.
- Walle M, Duseja A, Whittier DE, Vilaca T, Paggiosi M, Eastell R, Muller R, Collins CJ. Bone remodeling and responsiveness to mechanical stimuli in individuals with type 1 diabetes mellitus. Journal of Bone and Mineral Research. 2024;39(2):85-94.
- Walle M, Gabel L, Whittier DE, Liphardt AM, Hulme PA, Heer M, Zwart SR, Smith SM, Sibonga JD, Boyd SK. Tracking of spaceflight-induced bone remodeling reveals a limited time frame for recovery of resorption sites in humans. Science Advances. 2024;10(51):eadq3632.
What The Pipeline Does
For each subject, the pipeline can:
- Import raw AIM sessions into stack-level working artifacts.
- Generate missing full, trabecular, cortical, and segmentation volumes.
- Register each stack longitudinally across sessions.
- In multistack mode, estimate stack-to-stack correction transforms from per-stack superstacks.
- Apply the canonical final transforms once to original grayscale, mask, and segmentation data.
- Fill missing support regions in the fused transformed outputs.
- Compute pairwise remodelling and trajectory metrics.
Modes
- regular: timelapse registration, transform application, and analysis without multistack correction or filling.
- multistack: full pipeline including stack correction and filling.
Install
Preferred installation:
pip install timelapsed-hrpqct
Python support: 3.11, 3.12, 3.13.
Minimal setup in a fresh conda environment:
conda create -n timelapsed-hrpqct python=3.13 -y
conda activate timelapsed-hrpqct
pip install timelapsed-hrpqct
Install into an existing environment:
pip install timelapsed-hrpqct
This package is pip-first and pulls runtime dependencies (including aimio-py and itk-elastix) automatically.
Development install:
pip install -e ".[test]"
Optional conda environment for local development:
conda env create -f environment.yml
conda activate timelapsed-hrpqct
The installable package name is timelapsed-hrpqct, and the import package is timelapsedhrpqct.
The CLI uses the bundled package default config (src/timelapsedhrpqct/configs/defaults.yml) automatically if you do not pass --config.
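When a config is passed, overrides are typically merged over the bundled defaults. The sketch below illustrates one plausible deep-merge semantic; it is an assumption for illustration, not the pipeline's actual implementation. The keys mirror defaults mentioned elsewhere in this README (import.stack_depth, discovery.default_site):

```python
def merge_config(defaults: dict, override: dict) -> dict:
    """Recursively merge an override config over defaults.

    Keys in `override` win; nested dicts are merged key by key.
    """
    merged = dict(defaults)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge_config(merged[key], value)
        else:
            merged[key] = value
    return merged

defaults = {"import": {"stack_depth": 168}, "discovery": {"default_site": "tibia"}}
override = {"discovery": {"default_site": "radius"}}
cfg = merge_config(defaults, override)
# stack_depth is kept from defaults; default_site is overridden
```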
Slicer GUI (Developer Mode)
Until the extension is available in the Slicer Extensions Manager, you can use it in developer mode:
- Slicer extension repository: https://github.com/wallematthias/SlicerTimelapsedHRpQCT
Quick steps:
- Clone TimelapsedHRpQCTSlicer.
- In Slicer: Edit -> Application Settings -> Modules.
- Add module path: <repo>/TimelapsedHRpQCTSlicer/TimelapsedHRpQCT.
- Restart Slicer and open module TimelapsedHRpQCT.
- Click "Install / Update timelapsed-hrpqct" inside the module.
Quick Start
Preview discovery:
timelapse import /path/to/raw_data --dry-run
By default raw files are kept in place (no sourcedata/hrpqct copy):
timelapse run /path/to/raw_data
Enable copying raw files into sourcedata/hrpqct only when desired:
timelapse run /path/to/raw_data --copy-raw-inputs
Enable moving raw files into dataset root sub-*/site-*/ses-* layout only when desired:
timelapse run /path/to/raw_data --restructure-raw
Undo restructure moves (preview first):
timelapse undo-restructure /path/to/raw_data/imported_dataset --dry-run
timelapse undo-restructure /path/to/raw_data/imported_dataset
Run the default workflow (regular mode):
timelapse run /path/to/raw_data
Run while reusing pre-existing or custom masks (skip generation):
timelapse run /path/to/raw_data --skip-mask-generation
Use this when your input already includes valid masks (for example TRAB_MASK, CORT_MASK, FULL_MASK, REGMASK, or ROI*) and you do not want the pipeline to regenerate them.
Input discovery is recursive, so your source folder can be either flat/unstructured or organized in a BIDS/MIDS-style nested layout.
When filename parsing is ambiguous, discovery can fall back to AIM header metadata (Index Patient, Index Measurement, Site).
Left/right site aliases are supported (RL/RR/TL/TR/KL/KR) while generic radius/tibia/knee remains fully supported.
Run the full multistack workflow (if needed):
timelapse run /path/to/raw_data --mode multistack
Run the regular single-stack style workflow:
timelapse run /path/to/raw_data --mode regular
Run stages manually:
timelapse import /path/to/raw_data
timelapse generate-masks /path/to/raw_data/imported_dataset
timelapse register /path/to/raw_data/imported_dataset
timelapse stackcorrect /path/to/raw_data/imported_dataset
timelapse transform /path/to/raw_data/imported_dataset
timelapse fill /path/to/raw_data/imported_dataset
timelapse analyse /path/to/raw_data/imported_dataset
Pass --config /path/to/other.yml only when you want to override the built-in default.
The default analysis space is baseline_common, which is also the fastest option. pairwise_fixed_t0 is available for single-stack datasets, but it is slower because each timepoint pair is resampled during analysis.
Incremental Reruns
The run command is incremental:
- already imported sessions are skipped
- imported stacks with complete masks/seg are skipped by mask generation
- existing baseline transforms are reused
- existing final transforms are reused
- existing fused transformed sessions are reused
- existing filled sessions are reused
- existing analysis is reused unless you pass analysis overrides like --thr, --clusters, or --visualize
This makes it practical to rerun the pipeline after fixing one stage or adding new sessions without recomputing everything else.
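The skip logic above amounts to an artifact-existence check before each stage. A simplified illustration (the real pipeline tracks richer per-stack artifact records; the function and artifact names here are hypothetical):

```python
import tempfile
from pathlib import Path

def run_stage_if_needed(output: Path, stage, *, force: bool = False) -> bool:
    """Run `stage` only when its output artifact is missing (or force=True).

    Returns True when the stage actually ran.
    """
    if output.exists() and not force:
        return False  # artifact already present: skip recompute
    stage()
    return True

# Demo: the second invocation is skipped because the artifact exists.
calls = []
with tempfile.TemporaryDirectory() as d:
    out = Path(d) / "baseline_transform.json"
    def stage():
        calls.append(1)
        out.write_text("{}")
    ran_first = run_stage_if_needed(out, stage)   # runs and writes the artifact
    ran_second = run_stage_if_needed(out, stage)  # skipped
```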
Mask Roles And Naming
Discovery now supports both canonical and generic mask roles from filenames.
Examples:
# Distal radius (DR), standard trab/cort masks across sessions
SUBJ001_DR_T1.AIM
SUBJ001_DR_T1_TRAB_MASK.AIM
SUBJ001_DR_T1_CORT_MASK.AIM
SUBJ001_DR_T2.AIM
SUBJ001_DR_T2_TRAB_MASK.AIM
SUBJ001_DR_T2_CORT_MASK.AIM
SUBJ001_DR_T3.AIM
SUBJ001_DR_T3_TRAB_MASK.AIM
SUBJ001_DR_T3_CORT_MASK.AIM
# Distal tibia (DT)
SUBJ002_DT_T1.AIM
SUBJ002_DT_T1_TRAB_MASK.AIM
SUBJ002_DT_T1_CORT_MASK.AIM
# Knee (KN)
SUBJ003_KN_T1.AIM
SUBJ003_KN_T1_TRAB_MASK.AIM
SUBJ003_KN_T1_CORT_MASK.AIM
# Optional generic masks
SUBJ001_DR_T1_REGMASK.AIM
SUBJ001_DR_T1_ROI1.AIM
SUBJ001_DR_T1_ROI2.AIM
SUBJ001_DR_T1_MASK1.AIM
Behavior:
- REGMASK is preferred for registration when present.
- If no REGMASK exists, registration falls back to the trab+cort union, then full, then generic MASK* unions.
- For analysis compartments, ROI* masks are preferred when present across sessions.
- If no ROI* masks are present, regmask is used as the analysis ROI.
- Otherwise analysis uses configured compartments (or available trab/cort/full fallbacks).
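The registration-mask priority above can be sketched as a simple fallback chain. This is an illustrative sketch, not the pipeline's code; `available` maps role names to mask objects, and union construction is left abstract:

```python
def select_registration_mask(available: dict) -> tuple:
    """Pick registration mask inputs by priority:
    REGMASK, then trab+cort union, then full, then a union of generic MASK*.
    Returns (strategy_name, list_of_masks_to_combine).
    """
    if "regmask" in available:
        return ("regmask", [available["regmask"]])
    if "trab" in available and "cort" in available:
        return ("trab+cort", [available["trab"], available["cort"]])
    if "full" in available:
        return ("full", [available["full"]])
    generic = [available[k] for k in sorted(available) if k.startswith("mask")]
    if generic:
        return ("mask*", generic)
    raise ValueError("no usable registration mask found")
```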
Multistack Filename Parsing Notes
If your raw files are already split into physical stacks, include a stack token in the filename:
SUBJ001_DT_STACK01_T1.AIM
SUBJ001_DT_STACK01_T1_TRAB_MASK.AIM
SUBJ001_DT_STACK01_T1_CORT_MASK.AIM
SUBJ001_DT_STACK02_T1.AIM
SUBJ001_DT_STACK02_T1_TRAB_MASK.AIM
SUBJ001_DT_STACK02_T1_CORT_MASK.AIM
Accepted stack token styles include STACK01, STACK_01, and STACK-01.
Notes:
- If STACK... is present, files are grouped by that stack index during discovery.
- If STACK... is missing, the image is treated as a single acquisition and import splits by import.stack_depth (default 168).
- If the site token is missing, discovery uses discovery.default_site (default tibia).
- REGMASK is optional and overrides registration mask selection when present.
- ROI* masks are optional and override analysis compartments when consistently present across sessions.
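The accepted stack token styles (STACK01, STACK_01, STACK-01) can be matched with a single pattern. A hypothetical helper for illustration; the real discovery code may differ:

```python
import re

# Accepts STACK01, STACK_01, and STACK-01 (case-insensitive).
STACK_RE = re.compile(r"STACK[_-]?(\d+)", re.IGNORECASE)

def parse_stack_index(filename: str):
    """Return the stack index from a filename, or None for single acquisitions."""
    m = STACK_RE.search(filename)
    return int(m.group(1)) if m else None
```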
Repository Layout
- src/timelapsedhrpqct/workflows/: orchestration for each pipeline stage
- src/timelapsedhrpqct/processing/: reusable algorithmic and I/O helpers
- src/timelapsedhrpqct/dataset/: discovery, layout, artifact records, derivative paths
- src/timelapsedhrpqct/analysis/: remodelling analysis logic
- src/timelapsedhrpqct/configs/: bundled default YAML configuration
- tests/: unit, characterization, and end-to-end workflow tests
Documentation
Detailed documentation lives in docs/:
- Documentation Index
- Installation
- Usage
- Usage Examples
- Annotated Defaults
- Multistack Algorithm
- Timelapsed Analysis
- Settings Reference
- API Reference
Testing
Run the full test suite:
pytest -q
License
This repository is licensed under the MIT License. See LICENSE.
Packaging
The repository includes:
- environment.yml for local conda environments
- .github/workflows/ci.yml for tests and pip install smoke checks
- .github/workflows/publish-pypi.yml for trusted-publisher PyPI releases
- conda-recipe/ for conda packaging