
Efficient differentiable PDE solvers in JAX.

Project description

Efficient Differentiable n-d PDE solvers built on top of JAX & Equinox.

Installation · Documentation · Quickstart · Features · Background · Acknowledgements

Exponax is a suite for building Fourier spectral ETDRK time-steppers for semi-linear PDEs in 1d, 2d, and 3d. It ships many pre-built dynamics and plenty of helpful utilities. It is extremely efficient, fully differentiable (being written entirely in JAX), and embeds seamlessly into deep learning pipelines.

Installation

pip install exponax

Requires Python 3.10+ and JAX 0.4.13+. 👉 JAX install guide.

Documentation

Documentation is available at fkoehler.site/exponax.

Quickstart

1d Kuramoto-Sivashinsky Equation.

import jax
import exponax as ex
import matplotlib.pyplot as plt

# ETDRK stepper for the conservative 1d Kuramoto-Sivashinsky equation
ks_stepper = ex.stepper.KuramotoSivashinskyConservative(
    num_spatial_dims=1, domain_extent=100.0,
    num_points=200, dt=0.1,
)

# Random initial condition drawn from a truncated Fourier series
u_0 = ex.ic.RandomTruncatedFourierSeries(
    num_spatial_dims=1, cutoff=5
)(num_points=200, key=jax.random.PRNGKey(0))

# Autoregressively roll out 500 steps, keeping the initial state
trajectory = ex.rollout(ks_stepper, 500, include_init=True)(u_0)

# Spatio-temporal plot: time on the x-axis, space on the y-axis
plt.imshow(trajectory[:, 0, :].T, aspect='auto', cmap='RdBu', vmin=-2, vmax=2, origin="lower")
plt.xlabel("Time"); plt.ylabel("Space"); plt.show()

As a next step, check out this tutorial on 1d Advection, which explains the basics of Exponax.

Features

  1. JAX as the computational backend:
    1. Backend-agnostic code: run on CPU, GPU, or TPU, in both single and double precision.
    2. Automatic differentiation through the timesteppers: compute gradients of solutions with respect to initial conditions, parameters, etc.
    3. Tight integration with deep learning, since each timestepper is just an Equinox Module.
    4. Automatic vectorization via jax.vmap (or equinox.filter_vmap): advance multiple states in time, or instantiate multiple solvers that operate efficiently in batch.
  2. Lightweight Design without custom types. There is no grid or state object. Everything is based on JAX arrays. Timesteppers are callable PyTrees.
  3. More than 46 pre-built dynamics across 1d, 2d, and 3d:
    1. Linear PDEs (advection, diffusion, dispersion, etc.)
    2. Nonlinear PDEs (Burgers, Kuramoto-Sivashinsky, Korteweg-de Vries, Navier-Stokes, etc.)
    3. Reaction-Diffusion (Gray-Scott, Swift-Hohenberg, etc.)
  4. Collection of initial condition distributions (truncated Fourier series, Gaussian Random Fields, etc.)
  5. Utilities for spectral derivatives, grid creation, autoregressive rollout, interpolation, etc.
  6. Easily extendable to new PDEs by subclassing from the BaseStepper module.
  7. An alternative, reduced interface for defining PDE dynamics using normalized or difficulty-based identifiers.
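
To illustrate the automatic differentiation and vectorization points above, here is a minimal sketch using a toy exponential-decay stepper (a plain JAX function standing in for an Exponax stepper, not an Exponax class), showing a batched rollout with jax.vmap and a gradient with respect to the initial condition via jax.grad:

```python
import jax
import jax.numpy as jnp


def decay_step(u):
    # Toy "stepper": exact solution of du/dt = -u over dt = 0.1.
    return u * jnp.exp(-0.1)


def rollout_last(u_0, num_steps=50):
    # Advance the state num_steps times and return the final state.
    def body(u, _):
        return decay_step(u), None

    u_final, _ = jax.lax.scan(body, u_0, None, length=num_steps)
    return u_final


# Batch of 8 one-dimensional states with 200 points each.
batch = jnp.ones((8, 200))
finals = jax.vmap(rollout_last)(batch)  # shape (8, 200)

# Gradient of a scalar loss with respect to the initial condition.
loss = lambda u_0: jnp.mean(rollout_last(u_0) ** 2)
grad = jax.grad(loss)(jnp.ones((200,)))  # shape (200,)
```

Because an Exponax stepper is itself just a callable PyTree, the same jax.vmap / jax.grad transformations apply to it directly in place of the toy function here.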

Background

Exponax supports the efficient solution of (semi-linear) partial differential equations on periodic domains in arbitrary dimensions. Those are PDEs of the form

$$ \partial u/ \partial t = Lu + N(u), $$

where $L$ is a linear differential operator and $N$ is a nonlinear differential operator. The linear part can be solved exactly using a (matrix) exponential, and the nonlinear part is approximated using Runge-Kutta methods of various orders. These methods have long been known across scientific disciplines and were first unified by Cox & Matthews [1]. In particular, this package uses the complex contour integral method of Kassam & Trefethen [2] for numerical stability. The package is restricted to the original first-, second-, third-, and fourth-order methods. A recent study by Montanelli & Bootland [3] showed that the original ETDRK4 method is still one of the most efficient methods for these types of PDEs.
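
To make the splitting concrete, here is a minimal toy sketch (not Exponax's internal code) of the first-order scheme, exponential Euler / ETDRK1, for the 1d viscous Burgers equation $u_t = -u u_x + \nu u_{xx}$ in Fourier space. Per mode, the update reads $\hat u_{n+1} = e^{\hat L \Delta t} \hat u_n + \Delta t \, \varphi_1(\hat L \Delta t) \hat N(u_n)$ with $\varphi_1(z) = (e^z - 1)/z$; the constants (nu, dt, grid size) are arbitrary illustrative choices:

```python
import jax.numpy as jnp

nu, dt, num_points, domain_extent = 0.1, 0.01, 128, 2 * jnp.pi

# Angular wavenumbers for the real-valued FFT.
k = 2 * jnp.pi * jnp.fft.rfftfreq(num_points, d=domain_extent / num_points)
L_hat = -nu * k**2  # diffusion operator, diagonal in Fourier space

# phi_1(z) = (e^z - 1)/z, with the z -> 0 limit (= 1) handled explicitly.
z = L_hat * dt
phi_1 = jnp.where(z == 0.0, 1.0, (jnp.exp(z) - 1.0) / jnp.where(z == 0.0, 1.0, z))


def etdrk1_step(u):
    u_hat = jnp.fft.rfft(u)
    # Pseudo-spectral evaluation of the nonlinearity N(u) = -u * u_x.
    u_x = jnp.fft.irfft(1j * k * u_hat, n=num_points)
    N_hat = jnp.fft.rfft(-u * u_x)
    # Exponential Euler: exact linear propagation + phi_1-weighted nonlinearity.
    return jnp.fft.irfft(jnp.exp(z) * u_hat + dt * phi_1 * N_hat, n=num_points)


x = jnp.linspace(0.0, domain_extent, num_points, endpoint=False)
u = jnp.sin(x)
for _ in range(100):
    u = etdrk1_step(u)
```

The higher-order ETDRK schemes replace the single $\varphi_1$ weight with several $\varphi$-function stages, which is where the contour integral method of [2] becomes important for evaluating them stably.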

We focus on periodic domains on scaled hypercubes with a uniform Cartesian discretization. This allows using the Fast Fourier Transform, resulting in blazingly fast simulations. For example, a dataset of trajectories for the 2d Kuramoto-Sivashinsky equation with 50 initial conditions over 200 time steps on a 128x128 discretization is created in less than a second on a modern GPU.
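
On such a uniform periodic grid, for example, a spatial derivative reduces to a single multiplication by $ik$ in Fourier space. A minimal sketch (Exponax provides its own utilities for this; the grid and test function here are illustrative):

```python
import jax.numpy as jnp

num_points, domain_extent = 64, 2 * jnp.pi
x = jnp.linspace(0.0, domain_extent, num_points, endpoint=False)
u = jnp.sin(3 * x)

# Angular wavenumbers matching the real-valued FFT layout.
k = 2 * jnp.pi * jnp.fft.rfftfreq(num_points, d=domain_extent / num_points)

# Differentiation = multiplication by i*k in Fourier space.
du_dx = jnp.fft.irfft(1j * k * jnp.fft.rfft(u), n=num_points)
```

For smooth periodic functions this derivative is accurate to machine precision ("spectral accuracy"), which is a large part of why Fourier pseudo-spectral solvers are so efficient.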

[1] Cox, Steven M., and Paul C. Matthews. "Exponential time differencing for stiff systems." Journal of Computational Physics 176.2 (2002): 430-455.

[2] Kassam, A.K. and Trefethen, L.N., 2005. Fourth-order time-stepping for stiff PDEs. SIAM Journal on Scientific Computing, 26(4), pp.1214-1233.

[3] Montanelli, Hadrien, and Niall Bootland. "Solving periodic semilinear stiff PDEs in 1D, 2D and 3D with exponential integrators." Mathematics and Computers in Simulation 178 (2020): 307-327.

Acknowledgements

Related & Motivation

This package is greatly inspired by the chebfun library in MATLAB, in particular its spinX (Stiff Pde INtegrator in X dimensions) module. These MATLAB utilities were used extensively as data generators in early works on supervised physics-informed ML, e.g., DeepHiddenPhysics and Fourier Neural Operators (the links show where in their public repos they use the spinX module). The approach of pre-sampling the solvers, writing out the trajectories, and then using them for supervised training worked for those problems, but it limits the scope to purely supervised settings. Modern research ideas like correcting coarse solvers (see for instance the Solver-in-the-Loop paper or the ML-accelerated CFD paper) require the coarse solver to be differentiable, and some ideas of diverted-chain training also require the fine solver to be differentiable. Even for applications that do not need differentiable solvers, legacy solvers (like the MATLAB ones) pose an interface problem: we cannot easily query them "on-the-fly" for tasks like active learning, nor do they run efficiently on hardware accelerators (GPUs, TPUs, etc.). Additionally, they were not designed with batch execution (in the sense of vectorized application) in mind, which we get more or less for free via jax.vmap. With the reproducible randomness of JAX, we might never have to write out a dataset at all and can re-create it in seconds!

This package also took much inspiration from FourierFlows.jl in the Julia ecosystem, especially for checking the implementation of the contour integral method of [2] and for handling (de)aliasing.
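
Aliasing arises because pseudo-spectral products like $u \cdot u_x$ generate modes beyond the resolvable band; the classic remedy is Orszag's 2/3 rule, which zeroes the top third of the spectrum before forming the product. A minimal sketch of such a mask (illustrative only, not Exponax's exact implementation):

```python
import jax.numpy as jnp

num_points = 128

# Integer wavenumbers 0 .. num_points // 2 for the real-valued FFT.
k = jnp.fft.rfftfreq(num_points, d=1.0 / num_points)

# Orszag 2/3 rule: keep only modes with |k| < (2/3) * k_max.
mask = jnp.abs(k) < (2.0 / 3.0) * (num_points // 2)


def dealias(u_hat):
    # Zero out the highest third of the spectrum before forming products.
    return jnp.where(mask, u_hat, 0.0)


u_hat = jnp.fft.rfft(jnp.ones(num_points))
u_hat_dealiased = dealias(u_hat)
```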

Citation

This package was developed as part of the APEBench paper (accepted at NeurIPS 2024); we will soon add the citation here.

Funding

The main author (Felix Koehler) is a PhD student in the group of Prof. Thuerey at TUM and his research is funded by the Munich Center for Machine Learning.

License

MIT, see here


fkoehler.site  ·  GitHub @ceyron  ·  X @felix_m_koehler  ·  LinkedIn Felix Köhler
