JAX bindings to FINUFFT

This package provides a JAX interface to the Flatiron Institute Non-uniform Fast Fourier Transform (FINUFFT) library. Take a look at the FINUFFT docs for all the necessary definitions, conventions, and more information about the algorithms and their implementation. This package uses a low-level interface to directly expose the FINUFFT library to JAX's XLA backend, as well as implementing differentiation rules for the transforms.

Included features

This library includes CPU and GPU (CUDA) support. GPU support is implemented through the cuFINUFFT interface of the FINUFFT library.

Type 1, 2, and 3 transforms are supported in 1, 2, and 3 dimensions on the CPU and GPU. All of these functions support forward, reverse, and higher-order differentiation, as well as batching using vmap.

The FINUFFT plan interface is not directly exposed, although within a given jax-finufft call, plans are reused where possible, and transforms sharing the same non-uniform points are stacked/vectorized. All of the tuning options one can set in the plan interface are available through the opts argument of the jax-finufft API (see Advanced Usage).

Installation

The easiest way to install jax-finufft is from a pre-compiled binary on PyPI or conda-forge. Only CPU binaries are currently available on PyPI, while conda-forge has both CPU and GPU binaries. If you want GPU support without using conda, you can install jax-finufft from source as detailed below. This is also useful when you want to build finufft optimized for your hardware.

Install binary from PyPI

[!NOTE] Only the CPU-enabled build of jax-finufft is available as a binary wheel on PyPI. For a GPU-enabled build, you'll need to build from source as described below or use conda-forge.

To install a binary wheel from PyPI using uv, run the following command in a venv:

uv pip install jax-finufft

To install with pip instead of uv, simply drop uv from that command.

Install binary from conda-forge

To install a CPU build using mamba (or conda), run:

mamba install -c conda-forge jax-finufft

To install a GPU-enabled build, run:

mamba install -c conda-forge 'jax-finufft=*=cuda*'

Make note of the installed package version, like conda-forge/linux-64::jax-finufft-1.1.0-cuda129py312h8ad7275_1. The cuda129 substring indicates the package was built for CUDA 12.9. Your NVIDIA driver will need to support this version of CUDA. Only one CUDA build per major CUDA version is provided at present.

Install from source

Dependencies

Unsurprisingly, a key dependency is JAX, which can be installed following the directions in the JAX documentation. If you want to run on a GPU, make sure to install the appropriate JAX build.

The non-Python dependencies that you'll need are:

  • FFTW
  • OpenMP (optional, for CPU)
  • CUDA (optional, for GPU; we build against CUDA 12 and 13, and 11.8 may work too)

Below we provide some example workflows for installing the required dependencies:

Install CPU dependencies with mamba or conda:

mamba create -n jax-finufft -c conda-forge python jax fftw cxx-compiler
mamba activate jax-finufft

Install GPU dependencies with mamba or conda:

mamba create -n gpu-jax-finufft -c conda-forge python fftw cxx-compiler jax 'jaxlib=*=*cuda*'
mamba activate gpu-jax-finufft
mamba install cuda libcufft-static -c nvidia
export CMAKE_PREFIX_PATH=$CONDA_PREFIX:$CMAKE_PREFIX_PATH

Install GPU dependencies using the Flatiron module system:
ml modules/2.4 \
   gcc \
   python \
   uv \
   fftw \
   cuda/12.8 \
   cudnn/9

export CMAKE_ARGS="$CMAKE_ARGS -DCMAKE_CUDA_ARCHITECTURES=80;90;120 -DJAX_FINUFFT_USE_CUDA=ON"

Other ways of installing JAX are given on the JAX website; the "local CUDA" install methods are preferred for jax-finufft as this ensures the CUDA extensions are compiled with the same Toolkit version as the CUDA runtime. However, in theory, this is not required as long as both JAX and jax-finufft use CUDA with the same major version.

Note that jax is both a build-time and run-time dependency of jax-finufft. If the build-time version of jax is different from the run-time version, you may encounter issues. Most users will not need to worry about this, but if you do, consider installing jax-finufft without build isolation to enforce consistency.

Notes on CUDA versions

While jax-finufft may build with a wide range of CUDA versions, the resulting binaries may not be compatible with JAX (resulting in odd runtime errors, like failed cuDNN or cuBLAS initialization). For the greatest chance of success, we recommend building with the same CUDA version that JAX was built with. To discover that, one can look at the requirements in JAX's build directory (be sure to select the git tag for your version of JAX). Similarly, when installing from PyPI, we encourage using jax[cuda12-local] or jax[cuda13-local] so that JAX and jax-finufft use the same CUDA libraries. jax-finufft has optional dependencies of the same names for convenience.

Depending on how challenging the installation is, users might want to run jax-finufft in a container. The .devcontainer directory is a good starting point for this.

Configuring the build

There are several important CMake variables that control aspects of the jax-finufft and (cu)finufft builds. These include:

  • JAX_FINUFFT_USE_CUDA [disabled by default]: build with GPU support
  • CMAKE_CUDA_ARCHITECTURES [default native]: the target GPU architecture. native means the GPU arch of the build system.
  • FINUFFT_ARCH_FLAGS [default -march=native]: the target CPU architecture. The default is the native CPU arch of the build system.

Each of these can be set as -Ccmake.define.NAME=VALUE arguments to pip install or uv pip install. For example, to build with GPU support from the repo root, run:

uv pip install -Ccmake.define.JAX_FINUFFT_USE_CUDA=ON .

Use multiple -C arguments to set multiple variables. The -C argument works with any of the source installation methods (e.g., PyPI source dist, GitHub, pip install, uv pip install, uv sync, etc.).

Build options can also be set with the CMAKE_ARGS environment variable. For example:

export CMAKE_ARGS="$CMAKE_ARGS -DJAX_FINUFFT_USE_CUDA=ON"

GPU build configuration

Building with GPU support requires passing JAX_FINUFFT_USE_CUDA=ON to CMake. See Configuring the build.

By default, jax-finufft will build for the GPU of the build machine. If you need to target a different compute capability, such as 8.0 for Ampere, set CMAKE_CUDA_ARCHITECTURES as a CMake define:

uv pip install -Ccmake.define.JAX_FINUFFT_USE_CUDA=ON -Ccmake.define.CMAKE_CUDA_ARCHITECTURES=80 .

CMAKE_CUDA_ARCHITECTURES also takes a semicolon-separated list.

To detect the arch for a specific GPU, one can run:

$ nvidia-smi --query-gpu=compute_cap --format=csv,noheader
8.0

The values are also listed on the NVIDIA website.

In some cases, you may also need the following at runtime:

export LD_LIBRARY_PATH="$CUDA_HOME/extras/CUPTI/lib64:$LD_LIBRARY_PATH"

If CUDA_HOME isn't set, you'll need to replace it with the path to your CUDA installation in the above line, often something like /usr/local/cuda.

Install source from PyPI

The source code for all released versions of jax-finufft is available on PyPI and can be installed using:

uv pip install jax-finufft --no-binary jax-finufft

Install source from GitHub

Alternatively, you can check out the source repository from GitHub:

git clone --recurse-submodules https://github.com/flatironinstitute/jax-finufft
cd jax-finufft

[!NOTE] Don't forget the --recurse-submodules argument when cloning the repo because the upstream FINUFFT library is included as a git submodule. If you do forget, you can run git submodule update --init --recursive in your local copy to checkout the submodule after the initial clone.

After cloning the repository, you can install the local copy using the uv "project interface":

uv sync

or using the pip interface:

uv pip install -e .

where the optional -e flag performs an "editable" install.

As yet another alternative, the latest development version from GitHub can be installed directly (i.e. without cloning first) with

uv pip install git+https://github.com/flatironinstitute/jax-finufft.git

Usage

This library provides three high-level functions (and these should be all that you generally need to interact with): nufft1, nufft2, and nufft3 (for the three "types" of transforms). If you're already familiar with the Python interface to FINUFFT, please note that the function signatures here are different!

For example, here's how you can do a 1-dimensional type 1 transform:

import numpy as np

from jax_finufft import nufft1

M = 100000
N = 200000

rng = np.random.default_rng(123)
x = 2 * np.pi * rng.random(M)
c = rng.standard_normal(M) + 1j * rng.standard_normal(M)
f = nufft1(N, c, x, eps=1e-6, iflag=1)

Note that eps and iflag are optional, and that (for good reason, we promise!) the order of the positional arguments is reversed relative to the finufft Python package.

The syntax for a 2- or 3-dimensional transform is:

f = nufft1((Nx, Ny), c, x, y)  # 2D
f = nufft1((Nx, Ny, Nz), c, x, y, z)  # 3D

The syntax for a type 2 transform is (also allowing optional iflag and eps parameters):

c = nufft2(f, x)  # 1D
c = nufft2(f, x, y)  # 2D
c = nufft2(f, x, y, z)  # 3D

The syntax for a type 3 transform with "source points" x, y, z and "target points" s, t, u is:

f = nufft3(c, x, s)  # 1D
f = nufft3(c, x, y, s, t)  # 2D
f = nufft3(c, x, y, z, s, t, u)  # 3D

All of these functions support batching using vmap, and forward and reverse mode differentiation.

Stacked Transforms and Broadcasting

A "stacked", or "vectorized", finufft transform is one where the same non-uniform points are reused for multiple sets of source strengths. In the JAX interface, this is achieved by broadcasting. In the following example, only one finufft plan is created and one setpts call is made, with a stack of 32 source strengths:

import numpy as np

from jax_finufft import nufft1

M = 100000
N = 200000
S = 32

rng = np.random.default_rng(123)
x = 2 * np.pi * rng.random(M)
c = rng.standard_normal((S, M)) + 1j * rng.standard_normal((S, M))
f = nufft1(N, c, x)

To verify that a stacked transform is being used, see Inspecting the finufft calls.

Note that the broadcasting occurs because an implicit axis of length 1 is inserted in the second-to-last dimension of x. Currently, this is the only style of broadcasting that is supported when the strengths and points have unequal numbers of non-core dimensions. For other styles of broadcasting, insert axes of length 1 into the inputs. Any broadcast axes (even non-consecutive ones) are grouped and stacked in the transform.

Matched, but not broadcast, axes will be executed as separate transforms, each with their own setpts calls (but a single shared plan). In the following example (which continues from the previous), 1 plan is created and 4 setpts and 4 execute calls are made, each executing a stack of 32 transforms:

P = 4

x = 2 * np.pi * rng.random((P, 1, M))
c = rng.standard_normal((P, S, M)) + 1j * rng.standard_normal((P, S, M))
f = nufft1(N, c, x)

Selecting a platform

If you compiled jax-finufft with GPU support, you can force it to use a particular backend by setting the environment variable JAX_PLATFORMS=cpu or JAX_PLATFORMS=cuda.
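
Equivalently, the variable can be set from Python, as long as it is set before JAX is first imported (JAX reads it at import time):

```python
import os

# Force the CPU backend; this must happen before the first "import jax":
os.environ["JAX_PLATFORMS"] = "cpu"

import jax

print(jax.devices())
```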

Advanced usage

Options

The tuning parameters for the library can be set using the opts parameter to nufft1, nufft2, and nufft3. For example, to explicitly set the CPU up-sampling factor that FINUFFT should use, you can update the example from above as follows:

from jax_finufft import options

opts = options.Opts(upsampfac=2.0)
nufft1(N, c, x, opts=opts)

The corresponding option for the GPU is gpu_upsampfac. In fact, all options for the GPU are prefixed with gpu_, with the exception of modeord.

One complication here is that the vector-Jacobian product for a NUFFT requires evaluating a NUFFT of a different type. This means that you might want to separately tune the options for the forward and backward pass. This can be achieved using the options.NestedOpts interface. For example, to use a different up-sampling factor for the forward and backward passes, the code from above becomes:

import jax

opts = options.NestedOpts(
  forward=options.Opts(upsampfac=2.0),
  backward=options.Opts(upsampfac=1.25),
)
jax.grad(lambda args: nufft1(N, *args, opts=opts).real.sum())((c, x))

or, in this case equivalently:

opts = options.NestedOpts(
  type1=options.Opts(upsampfac=2.0),
  type2=options.Opts(upsampfac=1.25),
)

For descriptions of the options, see the corresponding pages in the FINUFFT docs.

Inspecting the finufft calls

When evaluating a single NUFFT, it's fairly obvious that jax-finufft will execute one finufft transform under the hood. However, when evaluating a stacked NUFFT, or taking the gradients of a NUFFT, the sequence of calls may be less obvious. One way to inspect exactly what finufft calls are being made is to enable finufft's debug output by passing opts=Opts(debug=True) or opts=Opts(gpu_debug=True).

For example, taking the Stacked Transforms example and enabling debug output, we see the following:

>>> f = nufft1(N, c, x, eps=1e-6, iflag=1, opts=Opts(debug=True))
[FINUFFT_PLAN_T] new plan: FINUFFT version 2.4.1 .................
[FINUFFT_PLAN_T] 1d1: (ms,mt,mu)=(200000,1,1) (nf1,nf2,nf3)=(400000,1,1)
               ntrans=32 nthr=16 batchSize=16  spread_thread=2
[FINUFFT_PLAN_T] kernel fser (ns=7):            0.000765 s
[FINUFFT_PLAN_T] fwBatch 0.05GB alloc:          0.00703 s
[FINUFFT_PLAN_T] FFT plan (mode 64, nthr=16):   0.00892 s
[setpts] sort (didSort=1):              0.00327 s
[execute] start ntrans=32 (2 batches, bsize=16)...
[execute] done. tot spread:             0.0236 s
               tot FFT:                         0.0164 s
               tot deconvolve:                  0.00191 s

Evidently, we are creating a single plan with 32 transforms, and finufft has chosen to batch them into two sets of 16. setpts is only called once, as is execute, as we would expect for a stacked transform.

Notes on the Implementation of the Gradients

The NUFFT gradients are implemented as Jacobian-vector products (JVP, i.e. forward-mode autodiff), with associated transpose rules that implement the vector-Jacobian product (VJP, reverse mode). These are found in ops.py, in the jvp and transpose functions.

The JVP of a D-dimensional type 1 or 2 NUFFT requires D transforms of the same type in D dimensions (considering just the gradients with respect to the non-uniform locations). Each transform is weighted by the frequencies (as an overall scaling for type 1, and at the Fourier strength level for type 2). These transforms are fully stacked, and finufft plans are reused where possible.

Furthermore, the JAX jvp evaluates the function in addition to its JVP, so 1 more transform is necessary. This transform is not stacked with the JVP transforms. Likewise, 1 more is needed when the gradient with respect to the source or Fourier strengths is requested. However, this transform is stacked with the JVP.

In reverse mode, the VJP of a type 1 NUFFT requires type 2 transforms, and type 2 requires type 1. In either case, the function evaluation returned under JAX's vjp still requires a NUFFT of the original type (which cannot be stacked with the VJP transforms, as they are of a different type).

For type 3, the JVP requires 2*D type 3 transforms of dimension D to evaluate the gradients with respect to both the source and target locations. The strengths of each transform are weighted by the source or target locations. The source and target transforms are stacked separately. As with type 1 and 2, the strengths gradient transform is stacked with the source locations and the function evaluation transform is not stacked.

The VJP of a type 3 NUFFT also uses type 3 NUFFTs, but with the source and target points swapped.

In all of the above, whenever a user requests stacked transforms via broadcasting, this does not introduce new plans or finufft calls; the stacks simply get deeper. New sets of non-uniform points necessarily introduce new setpts calls and new executions, but not new plans.

To see all of the stacking behavior in action, take a look at Inspecting the finufft calls.

Similar libraries

License & attribution

This package, developed by Dan Foreman-Mackey, is licensed under the Apache License, Version 2.0, with the following copyright:

Copyright 2021-2026 The Simons Foundation, Inc.

If you use this software, please cite the primary references listed on the FINUFFT docs.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

jax_finufft-1.3.1.tar.gz (4.7 MB)

Uploaded: Source

Built Distributions

  • jax_finufft-1.3.1-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (4.7 MB): CPython 3.14t, manylinux glibc 2.27+/2.28+ x86-64
  • jax_finufft-1.3.1-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (3.3 MB): CPython 3.14t, manylinux glibc 2.27+/2.28+ ARM64
  • jax_finufft-1.3.1-cp314-cp314t-macosx_14_0_arm64.whl (4.1 MB): CPython 3.14t, macOS 14.0+ ARM64
  • jax_finufft-1.3.1-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (4.7 MB): CPython 3.14, manylinux glibc 2.27+/2.28+ x86-64
  • jax_finufft-1.3.1-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (3.3 MB): CPython 3.14, manylinux glibc 2.27+/2.28+ ARM64
  • jax_finufft-1.3.1-cp314-cp314-macosx_14_0_arm64.whl (4.1 MB): CPython 3.14, macOS 14.0+ ARM64
  • jax_finufft-1.3.1-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (4.7 MB): CPython 3.13t, manylinux glibc 2.27+/2.28+ x86-64
  • jax_finufft-1.3.1-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (3.3 MB): CPython 3.13t, manylinux glibc 2.27+/2.28+ ARM64
  • jax_finufft-1.3.1-cp313-cp313t-macosx_14_0_arm64.whl (4.1 MB): CPython 3.13t, macOS 14.0+ ARM64
  • jax_finufft-1.3.1-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (4.7 MB): CPython 3.13, manylinux glibc 2.27+/2.28+ x86-64
  • jax_finufft-1.3.1-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (3.3 MB): CPython 3.13, manylinux glibc 2.27+/2.28+ ARM64
  • jax_finufft-1.3.1-cp313-cp313-macosx_14_0_arm64.whl (4.1 MB): CPython 3.13, macOS 14.0+ ARM64
  • jax_finufft-1.3.1-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (4.7 MB): CPython 3.12, manylinux glibc 2.27+/2.28+ x86-64
  • jax_finufft-1.3.1-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (3.3 MB): CPython 3.12, manylinux glibc 2.27+/2.28+ ARM64
  • jax_finufft-1.3.1-cp312-cp312-macosx_14_0_arm64.whl (4.1 MB): CPython 3.12, macOS 14.0+ ARM64
  • jax_finufft-1.3.1-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (4.7 MB): CPython 3.11, manylinux glibc 2.27+/2.28+ x86-64
  • jax_finufft-1.3.1-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (3.3 MB): CPython 3.11, manylinux glibc 2.27+/2.28+ ARM64
  • jax_finufft-1.3.1-cp311-cp311-macosx_14_0_arm64.whl (4.1 MB): CPython 3.11, macOS 14.0+ ARM64

File details

Details for the file jax_finufft-1.3.1.tar.gz.

File metadata

  • File name: jax_finufft-1.3.1.tar.gz
  • Size: 4.7 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for jax_finufft-1.3.1.tar.gz:

  • SHA256: 821d981bf180916ba8c5076bd6e267a4d0e1cb0b01ea60c029859f78db5a55f1
  • MD5: 60934d04e356746652c674a27561463c
  • BLAKE2b-256: a9d1899fde8e105e1086091789d00f20c56403436a55ec3a4007685fa14a6978

Provenance

The following attestation bundles were made for jax_finufft-1.3.1.tar.gz:

Publisher: wheels.yml on flatironinstitute/jax-finufft

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

