pfb-imaging
Radio interferometric imaging suite based on the preconditioned forward-backward algorithm. The project follows the hip-cargo package format: lightweight CLI installation with auto-generated stimela cab definitions and containerised execution.
Installation
Lightweight (CLI + cabs only):
pip install pfb-imaging
This installs the CLI and stimela cab definitions without the full scientific stack. The cabs can be included in stimela recipes using:
_include:
- (pfb_imaging.cabs)init.yml
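For context, a minimal stimela recipe skeleton that includes one of these cabs might look like the sketch below; the recipe name, step name, and cab name are illustrative assumptions rather than values taken from the package (only the `_include` line comes from the documentation above):

```yaml
# Hypothetical recipe fragment; names below are illustrative
_include:
  - (pfb_imaging.cabs)init.yml

my-imaging-recipe:
  info: "Sketch of a recipe step using an included cab"
  steps:
    init-step:
      cab: init            # assumed cab name defined in init.yml
      params:
        ms: path/to/data.ms
```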
Full stack:
To run the code natively, install the full stack:
pip install "pfb-imaging[full]"
For maximum performance install ducc0 in no-binary mode:
pip install ducc0 --no-binary ducc0
See the Development section for instructions on how to set the package up in development mode and make contributions.
Quick start
The easiest way to use pfb-imaging is via the stimela recipes given in the recipes folder.
Once the package is installed, a recipe can be queried for its input and output parameters using the stimela doc command.
For example, to see the inputs and outputs of the sara recipe, simply run
stimela doc 'pfb_imaging.recipes::sara.yaml'
The recipe can then be run with the stimela run command:
stimela run 'pfb_imaging.recipes::sara.yaml' sara \
ms=path/to/data.ms \
base-dir=path/to/base/output/directory \
image-name=saraout
The recipe should contain sensible defaults for MeerKAT data at L-band.
CLI documentation
The CLI is built with Typer and provides rich, auto-generated documentation. To list all available commands:
pfb --help
To get detailed documentation for a specific command including all parameters, types, and defaults:
pfb init --help
This is often more useful than stimela doc as it shows the full parameter documentation with types and defaults directly in the terminal.
CLI commands
The processing pipeline follows a modular pattern where each step is a separate command:
- `pfb init` -- Parse measurement sets into xarray datasets
- `pfb grid` -- Create dirty images, PSFs, and weights
- `pfb kclean` -- Classical deconvolution (Hogbom/Clark)
- `pfb sara` -- Advanced deconvolution with sparsity constraints
- `pfb restore` -- Restore clean components to final image
- `pfb degrid` -- Subtract model from visibilities
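Chained together, a typical run through the pipeline might look like the sketch below; the file names are hypothetical, the values are not defaults, and only flags documented on this page are used:

```
# Hypothetical end-to-end run; adjust paths and names to your data
pfb init --ms data.ms --output-filename out
pfb grid --output-filename out
pfb kclean --output-filename out
pfb restore --output-filename out
```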
Additional commands:
- `pfb deconv` -- General deconvolution (replaces individual algorithm apps)
- `pfb hci` -- High-cadence imaging
- `pfb fluxtractor` -- Flux extraction
- `pfb model2comps` -- Convert model to components
Execution backends
Every command supports a --backend option that controls how the command is executed.
This is provided by hip-cargo and enables container fallback execution: when the full scientific stack is not installed locally, commands automatically run inside a container.
Available backends:
- `auto` (default) -- Try native execution first; if the core module import fails (lightweight install), fall back to the best available container runtime.
- `native` -- Run natively using the locally installed Python environment. Fails with `ImportError` if dependencies are missing.
- `docker` -- Run inside a Docker container.
- `podman` -- Run inside a Podman container (daemonless, rootless).
- `apptainer` -- Run inside an Apptainer container (HPC-friendly, formerly Singularity).
- `singularity` -- Run inside a Singularity container.
An additional --always-pull-images flag forces re-pulling the container image before execution, useful for ensuring you have the latest version.
Example usage:
# Run natively (requires full install)
pfb init --ms data.ms --output-filename out --backend native
# Run in a Docker container (lightweight install only)
pfb init --ms data.ms --output-filename out --backend docker
# Auto-detect: native if available, otherwise container
pfb init --ms data.ms --output-filename out
Volume mounts are resolved automatically from the command's type hints: input paths are mounted read-only, output paths read-write. Docker and Podman run as the current user to avoid root-owned output files.
Default naming conventions
Output files follow consistent naming patterns using --output-filename, --product, and --suffix:
- XDS datasets: `{output_filename}_{product}.xds`
- DDS datasets: `{output_filename}_{product}_{suffix}.dds`
- Models: `{output_filename}_{product}_{suffix}_model.mds`
- FITS files: same convention with the appropriate extensions
The --suffix parameter (default main) allows imaging multiple fields from a single set of corrected Stokes visibilities.
For example, the sun can be imaged by setting --target sun --suffix sun.
The --target parameter accepts any object name that astropy can resolve, or explicit coordinates in HH:MM:SS,DD:MM:SS format.
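As a sketch of how the naming pieces combine, the documented patterns expand as follows (the product and suffix values here are hypothetical):

```shell
# Compose output names following the documented pattern
output_filename=out
product=I        # hypothetical Stokes product
suffix=sun       # e.g. when imaging the sun via --target sun --suffix sun
echo "${output_filename}_${product}.xds"                  # out_I.xds
echo "${output_filename}_${product}_${suffix}.dds"        # out_I_sun.dds
echo "${output_filename}_${product}_${suffix}_model.mds"  # out_I_sun_model.mds
```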
Parallelism settings
Two settings control parallelism:
- `--nworkers` controls how many chunks (usually imaging bands) are processed in parallel.
- `--nthreads` specifies the number of threads available to each worker (gridding, FFTs, wavelet transforms).
By default a single worker is used for the smallest memory footprint and easy debugging.
Set --nworkers larger than one to use multiple Dask workers for parallel chunk processing.
The product of --nworkers and --nthreads should not exceed available resources.
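For instance, on a 32-core node one might budget the cores like this; the split itself is a judgment call, not a package default:

```shell
# Keep nworkers * nthreads at or below the available core count
total_cores=32          # hypothetical node size; `nproc` reports the real value
nworkers=4              # parallel chunks (imaging bands)
nthreads=$(( total_cores / nworkers ))
echo "nworkers=${nworkers} nthreads=${nthreads}"   # nworkers=4 nthreads=8
```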
Package structure
The project follows the hip-cargo src layout:
pfb-imaging/
├── src/pfb_imaging/
│ ├── cli/ # Lightweight CLI wrappers (Typer)
│ ├── core/ # Core implementations (lazy-loaded)
│ ├── cabs/ # Generated Stimela cab definitions (YAML)
│ ├── deconv/ # Deconvolution algorithms
│ ├── operators/ # Mathematical operators (gridding, PSF, Psi)
│ ├── opt/ # Optimization algorithms (PCG, FISTA, primal-dual)
│ ├── prox/ # Proximal operators
│ ├── utils/ # Utility functions
│ └── wavelets/ # Wavelet transform implementations
├── scripts/ # Profiling and automation scripts
├── tests/
├── Dockerfile
└── pyproject.toml
Key separation: CLI modules (cli/) are lightweight with lazy imports so that pfb --help and cab generation don't pull in the full scientific stack.
Core implementations live in core/ and are imported only when a command is executed.
Container images
Container images are published to GitHub Container Registry at ghcr.io/ratt-ru/pfb-imaging.
The full image URL (including tag) is the single source of truth and lives in src/pfb_imaging/_container_image.py as the CONTAINER_IMAGE variable, loaded via importlib (no CWD dependency, no uv sync needed).
CONTAINER_IMAGE = "ghcr.io/ratt-ru/pfb-imaging:<tag>"
The <tag> is managed by three mechanisms:
- Feature branches: the developer manually updates the tag in `_container_image.py` to match the branch name.
- Merge to main: the `update-cabs.yml` GitHub Action rewrites the tag to `latest`, regenerates cab definitions, and commits the changes.
- Releases: `tbump` rewrites the tag to the semantic version (e.g. `0.0.9`) via `before_commit` hooks in `tbump.toml`.
Cab definitions are auto-generated with the correct image tag via pre-commit hooks and the update-cabs.yml GitHub Action -- the image URL is read from _container_image.py at generation time, so the --image flag is not needed.
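On a feature branch, the manual retag can be a one-liner. The sketch below demonstrates it on a temporary copy of the file; the `sed` pattern assumes the single-line `CONTAINER_IMAGE = "..."` layout shown above, and `my-branch` is a placeholder branch name:

```shell
# Demonstrate the retag on a scratch copy of _container_image.py
file=$(mktemp)
echo 'CONTAINER_IMAGE = "ghcr.io/ratt-ru/pfb-imaging:latest"' > "$file"
sed -i 's|pfb-imaging:[^"]*|pfb-imaging:my-branch|' "$file"
cat "$file"   # CONTAINER_IMAGE = "ghcr.io/ratt-ru/pfb-imaging:my-branch"
rm -f "$file"
```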
Development
This project uses:
- uv for dependency management
- ruff for linting and formatting (core dependency -- `generate-function` runs `ruff format` and `ruff check --fix` on generated code)
- typer for the CLI
- git-cliff for CHANGELOG automation
Setting Up Development Environment
# Clone the repository
git clone https://github.com/ratt-ru/pfb-imaging.git
cd pfb-imaging
# Install dependencies with development tools
uv sync --extra full --group dev --group test
# Install pre-commit hooks (recommended)
uv run pre-commit install --hook-type commit-msg
This will automatically run the hooks before each commit. If any checks fail, the commit will be blocked until you fix the issues.
Running Hooks Manually
You can run the hooks manually on all files:
# Run on all files
uv run pre-commit run --all-files
# Run on staged files only
uv run pre-commit run
Updating Hook Versions
To update hook versions to the latest:
uv run pre-commit autoupdate
Manual Code Quality Checks
If you prefer to run checks manually without pre-commit:
# Format code
uv run ruff format .
# Check and auto-fix linting issues
uv run ruff check . --fix
# Run tests
uv run pytest -v
Commit Message Convention
This project uses Conventional Commits to enable automated changelog generation via git-cliff.
Every commit message should follow this format:
<type>: <description>
[optional body]
Types:
| Type | When to use | Changelog section |
|---|---|---|
| `feat` | New feature or capability | Added |
| `fix` | Bug fix | Fixed |
| `refactor` | Code change that neither fixes a bug nor adds a feature | Changed |
| `perf` | Performance improvement | Changed |
| `docs` | Documentation only | Documentation |
| `test` | Adding or updating tests | Testing |
| `ci` | CI/CD changes | CI |
| `deps` | Dependency updates | Dependencies |
| `chore` | Maintenance tasks (cab regeneration, formatting) | Miscellaneous |
Examples:
git commit -m "feat: add support for MS dtype in type inference"
git commit -m "fix: handle empty docstrings in introspector"
git commit -m "refactor: simplify generate_cabs output formatting"
git commit -m "docs: add container fallback section to README"
git commit -m "test: add roundtrip test for List types"
Scoped commits (optional): Use parentheses to specify the affected component:
git commit -m "feat(init): add --license-type option for BSD-3-Clause"
git commit -m "fix(runner): resolve volume mount for symlinked paths"
Contributing Workflow
1. Create a feature branch:

   git checkout -b your-feature-name

2. Update the container image tag in `src/pfb_imaging/_container_image.py` to match your branch name. This ensures the cab definitions generated by pre-commit hooks use the correct branch-specific image tag during development. You do not need to reset the tag before merging -- the `update-cabs` workflow handles that automatically on merge to `main`.

3. Make your changes and ensure tests pass:

   uv run pytest -v

4. Commit using conventional commit messages:

   git add .
   git commit -m "feat: your feature description"  # Pre-commit hooks run automatically

   The pre-commit hooks keep the CLI and the corresponding cab definitions in sync, and enforce code quality and conventional commit messages.

5. Push and create a pull request:

   git push origin your-feature-name
The GitHub Actions workflows automate containerisation by pushing container images to the GitHub Container Registry. Once the PR is merged, the image built for the branch is also synced to the `:latest` tag.
Acknowledgement
If you find any of this useful, please cite the pfb-imaging paper.
File details
Details for the file pfb_imaging-0.0.9.tar.gz.

File metadata
- Download URL: pfb_imaging-0.0.9.tar.gz
- Upload date:
- Size: 169.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `ad1121c4d3884a034d95e4b5dad76fb184a3a0aded44eb5b33d8443fce79e1c0` |
| MD5 | `4a2a94583fc27c7fbab82525fcf292c7` |
| BLAKE2b-256 | `eab59cb6a361e23fca68726cc0dbd9a039c10bb93d317a67d63bc425421532f9` |
Provenance
The following attestation bundles were made for pfb_imaging-0.0.9.tar.gz:

Publisher: publish.yml on ratt-ru/pfb-imaging

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: pfb_imaging-0.0.9.tar.gz
- Subject digest: ad1121c4d3884a034d95e4b5dad76fb184a3a0aded44eb5b33d8443fce79e1c0
- Sigstore transparency entry: 1331616616
- Sigstore integration time:
- Permalink: ratt-ru/pfb-imaging@43507cba561f646bf3cf10e540336e36e8611cfa
- Branch / Tag: refs/tags/v0.0.9
- Owner: https://github.com/ratt-ru
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@43507cba561f646bf3cf10e540336e36e8611cfa
- Trigger Event: push
File details
Details for the file pfb_imaging-0.0.9-py3-none-any.whl.

File metadata
- Download URL: pfb_imaging-0.0.9-py3-none-any.whl
- Upload date:
- Size: 234.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `92e2d7e3040f91db82d334862c18782d3b3e277c90f69a697b6d145c09bbd509` |
| MD5 | `643036bccd3c3ad9b0d7922edd483a7f` |
| BLAKE2b-256 | `5ca147092d06601b233641ecefcf686b504c3eddaac851e464d7370d32885784` |

Provenance
The following attestation bundles were made for pfb_imaging-0.0.9-py3-none-any.whl:

Publisher: publish.yml on ratt-ru/pfb-imaging

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: pfb_imaging-0.0.9-py3-none-any.whl
- Subject digest: 92e2d7e3040f91db82d334862c18782d3b3e277c90f69a697b6d145c09bbd509
- Sigstore transparency entry: 1331616742
- Sigstore integration time:
- Permalink: ratt-ru/pfb-imaging@43507cba561f646bf3cf10e540336e36e8611cfa
- Branch / Tag: refs/tags/v0.0.9
- Owner: https://github.com/ratt-ru
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@43507cba561f646bf3cf10e540336e36e8611cfa
- Trigger Event: push