
Pure JAX implementation of Non-Uniform FFT


nufftax logo

Pure JAX implementation of the Non-Uniform Fast Fourier Transform (NUFFT)

CI · Documentation · Python 3.12+ · License: MIT


MRI reconstruction example

Why nufftax?

A JAX package for NUFFT already exists: jax-finufft. However, it wraps the C++ FINUFFT library via Foreign Function Interface (FFI), exposing it through custom XLA calls. This approach can lead to:

  • Kernel fusion issues on GPU — custom XLA calls act as optimization barriers, preventing XLA from fusing operations
  • CUDA version matching — GPU support requires matching CUDA versions between JAX and the library

nufftax takes a different approach — pure JAX implementation:

  • Fully differentiable — gradients w.r.t. both values and sample locations
  • Pure JAX — works with jit, grad, vmap, jvp, vjp with no FFI barriers
  • GPU ready — runs on CPU/GPU without code changes, benefits from XLA fusion
  • Pallas GPU kernels — fused Triton spreading kernels with 5-75x speedups on A100/H100
  • All NUFFT types — Type 1, 2, 3 in 1D, 2D, 3D
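To make the transform types concrete: Type 1 computes f_k = Σ_j c_j e^{i k x_j} for integer frequencies k = -N/2, …, N/2-1, mapping nonuniform points to uniform Fourier modes (Type 2 goes the other way). A naive O(M·N) direct sum spells out this definition — pure NumPy, illustrative only, not nufftax's fast algorithm:

```python
import numpy as np

def nufft1d1_direct(x, c, n_modes):
    """Naive O(M*N) Type-1 sum: f[k] = sum_j c[j] * exp(1j*k*x[j]),
    for integer frequencies k = -n_modes//2 .. n_modes//2 - 1."""
    k = np.arange(-(n_modes // 2), (n_modes + 1) // 2)
    return np.exp(1j * np.outer(k, x)) @ c

x = np.array([0.1, 0.7, 1.3, 2.1, -0.5])
c = np.array([1.0 + 0.5j, 0.3 - 0.2j, 0.8 + 0.1j, 0.2 + 0.4j, 0.5 - 0.3j])
f = nufft1d1_direct(x, c, n_modes=8)

# The k = 0 mode (index n_modes//2) is just the plain sum of the strengths.
print(np.allclose(f[4], c.sum()))  # True
```

A fast NUFFT returns these same sums to a requested tolerance eps, rather than exactly.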

JAX Transformation Support

Transform           jit   grad/vjp   jvp   vmap
Type 1 (1D/2D/3D)    ✓       ✓        ✓     ✓
Type 2 (1D/2D/3D)    ✓       ✓        ✓     ✓
Type 3 (1D/2D/3D)    ✓       ✓        ✓     ✓

Differentiable inputs:

  • Type 1: grad w.r.t. c (strengths) and x, y, z (coordinates)
  • Type 2: grad w.r.t. f (Fourier modes) and x, y, z (coordinates)
  • Type 3: grad w.r.t. c (strengths), x, y, z (source coordinates), and s, t, u (target frequencies)
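The coordinate gradients exist because each output mode depends smoothly on the sample locations: for Type 1, ∂f_k/∂x_j = i k c_j e^{i k x_j}. A finite-difference sanity check of that identity — plain NumPy, illustrative only, not how nufftax implements its VJP rules:

```python
import numpy as np

def f_k(x, c, k):
    # Single Fourier mode of the Type-1 sum: f_k = sum_j c_j * exp(1j*k*x_j)
    return np.sum(c * np.exp(1j * k * x))

x = np.array([0.1, 0.7, 1.3])
c = np.array([1.0 + 0.5j, 0.3 - 0.2j, 0.8 + 0.1j])
k, j, h = 2, 1, 1e-6

# Analytic derivative of f_k with respect to the j-th sample location.
analytic = 1j * k * c[j] * np.exp(1j * k * x[j])

# Central finite difference in x_j.
xp, xm = x.copy(), x.copy()
xp[j] += h
xm[j] -= h
numeric = (f_k(xp, c, k) - f_k(xm, c, k)) / (2 * h)

print(np.allclose(analytic, numeric, atol=1e-8))  # True
```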

GPU Acceleration

On GPU, nufftax automatically dispatches spreading and interpolation to fused Pallas (Triton) kernels when the problem is large enough. This avoids materializing O(M × nspread^d) intermediate tensors and uses atomic scatter-add for spreading.
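The spreading step deposits each strength onto a small window of nearby grid points, which is what a fused scatter-add kernel accelerates. A minimal 1D sketch, with NumPy's `np.add.at` standing in for the GPU atomic scatter-add; the Gaussian window and its width are illustrative placeholders, not nufftax's actual kernel:

```python
import numpy as np

def spread_1d(x, c, n_grid, nspread=8, sigma=1.0):
    """Scatter each strength c[j] onto the `nspread` grid points nearest x[j]."""
    h = 2 * np.pi / n_grid                          # grid spacing on [-pi, pi)
    grid_idx = np.floor((x + np.pi) / h).astype(int)
    out = np.zeros(n_grid, dtype=complex)
    for off in range(-(nspread // 2), nspread // 2):
        idx = (grid_idx + off) % n_grid             # periodic wrap-around
        centers = -np.pi + idx * h
        dist = np.angle(np.exp(1j * (x - centers)))  # wrapped distance to center
        w = np.exp(-(dist / (sigma * h)) ** 2)       # illustrative Gaussian window
        np.add.at(out, idx, w * c)                   # atomic-style scatter-add
    return out

x = np.array([0.1, 0.7, -2.0])
c = np.array([1.0 + 0.0j, 0.5j, -0.25])
g = spread_1d(x, c, n_grid=64)
```

Vectorizing this naively materializes an (M, nspread) array of weights per dimension; the fused kernel instead accumulates each window in place.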

Operation   Backend      Speedup vs pure JAX
1D spread   A100         5–67x (M ≥ 100K)
1D spread   H100         4–75x (M ≥ 100K)
2D spread   A100/H100    2–3x  (M ≥ 100K)

The dispatch is transparent — no code changes required. On CPU or for small problems, the pure JAX path is used.

Installation

CPU only:

uv pip install nufftax

With CUDA 12 GPU support:

uv pip install "nufftax[cuda12]"

Development install (from source):

git clone https://github.com/GragasLab/nufftax.git
cd nufftax
uv pip install -e ".[dev]"

This installs the development dependencies (pytest, ruff, pre-commit, and finufft for comparison testing).

Development install with CUDA 12:

uv pip install -e ".[dev,cuda12]"

With docs dependencies:

uv pip install -e ".[docs]"

Quick Example

import jax
import jax.numpy as jnp
from nufftax import nufft1d1

# Irregular sample locations in [-pi, pi)
x = jnp.array([0.1, 0.7, 1.3, 2.1, -0.5])
c = jnp.array([1.0+0.5j, 0.3-0.2j, 0.8+0.1j, 0.2+0.4j, 0.5-0.3j])

# Compute Fourier modes
f = nufft1d1(x, c, n_modes=32, eps=1e-6)

# Differentiate through the transform
grad_c = jax.grad(lambda c: jnp.sum(jnp.abs(nufft1d1(x, c, n_modes=32)) ** 2))(c)
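Because the transform is pure JAX, batching falls out of `jax.vmap` as well. A self-contained sketch using a hypothetical O(M·N) direct-sum stand-in for `nufft1d1` (so it runs without nufftax installed; the real call composes with `vmap` the same way):

```python
import jax
import jax.numpy as jnp

def nufft1d1_direct(x, c, n_modes):
    # Direct-sum stand-in for nufft1d1: same Type-1 sums it approximates.
    k = jnp.arange(-(n_modes // 2), (n_modes + 1) // 2)
    return jnp.exp(1j * k[:, None] * x[None, :]) @ c

x = jnp.array([0.1, 0.7, 1.3, 2.1, -0.5])
cs = jnp.stack([jnp.ones(5).astype(jnp.complex64),
                jnp.arange(5).astype(jnp.complex64)])

# vmap over a batch of strength vectors sharing one set of sample locations.
f_batch = jax.vmap(lambda c: nufft1d1_direct(x, c, 32))(cs)
print(f_batch.shape)  # (2, 32)
```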

Documentation

Read the full documentation →

License

MIT License. The algorithm is based on FINUFFT by the Flatiron Institute.

Citation

If you use nufftax in your research, please cite:

@software{nufftax,
  author = {Gragas and Oudoumanessah, Geoffroy and Iollo, Jacopo},
  title = {nufftax: Pure JAX implementation of the Non-Uniform Fast Fourier Transform},
  url = {https://github.com/GragasLab/nufftax},
  year = {2026}
}

@article{finufft,
  author = {Barnett, Alexander H. and Magland, Jeremy F. and af Klinteberg, Ludvig},
  title = {A parallel non-uniform fast Fourier transform library based on an ``exponential of semicircle'' kernel},
  journal = {SIAM J. Sci. Comput.},
  volume = {41},
  number = {5},
  pages = {C479--C504},
  year = {2019}
}
