Finite-Difference Reynolds Fluid Solver
An efficient finite-difference solver for the Reynolds equation in thin fluid films.
Description
An efficient Python code that solves the Reynolds equation (a diffusion-type equation) on Cartesian and polar grids:
- Reynolds equation:
$$\nabla\cdot\left(g^3\nabla p\right)=0$$
- Inlet/outlet pressure:
$$p(x=0)=1,\quad p(x=1)=0$$
- Periodic boundary conditions in $y$:
$$p(x,y=0)=p(x,y=1),\quad \frac{\partial p}{\partial y}(x,y=0)=\frac{\partial p}{\partial y}(x,y=1)$$
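For orientation, a standard conservative five-point discretization of this equation on a uniform grid reads as follows (the face conductivities $g^3_{i\pm 1/2,\,j}$ are averages of neighbouring cell values; the exact averaging scheme used by the package is not stated here):
$$\frac{g^3_{i+1/2,\,j}\left(p_{i+1,j}-p_{i,j}\right)-g^3_{i-1/2,\,j}\left(p_{i,j}-p_{i-1,j}\right)}{\Delta x^2}+\frac{g^3_{i,\,j+1/2}\left(p_{i,j+1}-p_{i,j}\right)-g^3_{i,\,j-1/2}\left(p_{i,j}-p_{i,j-1}\right)}{\Delta y^2}=0$$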
What can it do?
- It takes as input a gap field $g$.
- It analyzes the connectivity of the gap field, removes isolated islands, and checks for percolation, i.e. whether a flow problem can be solved at all (see the sketch after this list).
- It dilates the non-zero gap field to properly handle the impenetrability of channel walls, which makes it unnecessary to erode the domain for the flux calculation.
- It applies an inlet pressure $p_i=1$ on one side ($x=0$), an outlet pressure $p_o=0$ on the opposite side ($x=1$), and periodic boundary conditions on the lateral sides $y=\{0,1\}$.
- It constructs a sparse matrix with conductivities proportional to $g^3$ (see the assembly sketch after this list).
- Different solvers (direct, and iterative with appropriate preconditioners) can be selected and tuned to solve the resulting linear system of equations efficiently.
- The total flux is properly computed.
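For illustration, here is a minimal sketch of the connectivity and percolation checks using scipy.ndimage. This is a plausible approach, not the package's internal code; note that the periodic lateral boundaries supported by the package are ignored in this sketch.

```python
import numpy as np
from scipy import ndimage

def percolates(gaps: np.ndarray) -> bool:
    # The flow problem is solvable only if a connected cluster of open cells
    # (g > 0) spans from the inlet (x = 0, first column) to the outlet
    # (x = 1, last column).
    labels, _ = ndimage.label(gaps > 0)    # label connected open clusters
    inlet = set(labels[:, 0]) - {0}        # cluster ids touching x = 0
    outlet = set(labels[:, -1]) - {0}      # cluster ids touching x = 1
    return bool(inlet & outlet)            # any cluster spanning both sides?

def remove_isolated_islands(gaps: np.ndarray) -> np.ndarray:
    # Zero out clusters that touch neither the inlet nor the outlet side.
    labels, _ = ndimage.label(gaps > 0)
    keep = (set(labels[:, 0]) | set(labels[:, -1])) - {0}
    return np.where(np.isin(labels, list(keep)), gaps, 0.0)
```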
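And a minimal sketch of how such a $g^3$-conductivity matrix can be assembled with SciPy. This is illustrative only: harmonic face averaging is an assumption, the Dirichlet inlet/outlet and periodic lateral boundary conditions are omitted, and the package's actual assembly may differ.

```python
import numpy as np
import scipy.sparse as sp

def assemble_conductivity_matrix(gaps: np.ndarray, h: float) -> sp.csr_matrix:
    # Five-point conservative stencil for div(g^3 grad p) = 0 on an n x n grid.
    n = gaps.shape[0]
    g3 = gaps.astype(float)**3

    def harmonic(a, b):
        # Harmonic mean of neighbouring g^3 values (zero across closed faces).
        s = a + b
        return np.where(s > 0, 2.0 * a * b / np.where(s > 0, s, 1.0), 0.0)

    gx = harmonic(g3[:, :-1], g3[:, 1:]) / h**2   # faces between x-neighbours
    gy = harmonic(g3[:-1, :], g3[1:, :]) / h**2   # faces between y-neighbours

    idx = np.arange(n * n).reshape(n, n)
    diag = np.zeros(n * n)
    rows, cols, vals = [], [], []
    for cond, a, b in ((gx, idx[:, :-1], idx[:, 1:]),
                       (gy, idx[:-1, :], idx[1:, :])):
        c, ia, ib = cond.ravel(), a.ravel(), b.ravel()
        rows += [ia, ib]; cols += [ib, ia]; vals += [-c, -c]
        np.add.at(diag, ia, c)   # each face contributes to both diagonals
        np.add.at(diag, ib, c)
    rows.append(idx.ravel()); cols.append(idx.ravel()); vals.append(diag)
    return sp.csr_matrix((np.concatenate(vals),
                          (np.concatenate(rows), np.concatenate(cols))),
                         shape=(n * n, n * n))
```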
Usage
- Install the package:
```bash
pip install reynoldsflow
```
The default solver is `scipy.amg-rs` (SciPy CG + Ruge-Stüben AMG via PyAMG), included in the base install.
For the optional high-performance solvers (PARDISO, PETSc, CHOLMOD):
```bash
pip install reynoldsflow[solvers]
```
With the `[solvers]` extra installed, `solver="auto"` selects `petsc-cg.hypre` as the fastest option.
- Run a minimal example (flow around a circular inclusion):
```python
import numpy as np
import matplotlib.pyplot as plt
from reynoldsflow import transport as FS

n = 100
X, Y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
# Unit gap everywhere except inside a closed circular inclusion
gaps = (np.sqrt((X - 0.5)**2 + (Y - 0.5)**2) > 0.2).astype(float)

_, _, flux = FS.solve_fluid_problem(gaps, solver="scipy.amg-rs")

if flux is not None:
    # Plot the magnitude of the flux field
    plt.imshow(np.sqrt(flux[:, :, 0]**2 + flux[:, :, 1]**2),
               origin='lower', cmap='jet')
    plt.show()
```
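As a hedged follow-up (assuming, consistently with the magnitude plot above, that flux[:, :, 0] holds the x-component of the flux), the total flux through the outlet can be estimated by integrating that component over the last grid column:

```python
# Estimate the total flux through the outlet cross-section x = 1.
# Assumes flux[:, :, 0] is the x-component, as in the magnitude plot above.
if flux is not None:
    dy = 1.0 / (n - 1)
    total_flux = np.sum(flux[:, -1, 0]) * dy
    print(f"Total flux through the outlet: {total_flux:.6g}")
```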
- Run the test suite:
```bash
python -m pytest -q
```
- Or run individual tests manually:
  - /tests/test_evolution.py: solves the flux evolution problem
  - /tests/test_solve.py: solves the flux problem on a Cartesian grid
  - /tests/polar_flow.py: solves the flux problem on a polar grid
  - /tests/test_solvers.py: tests all solvers
Available Solvers and Preconditioners
The fluid flow solver supports several linear system solvers and preconditioners for efficient and robust solution of large sparse systems:
| Solver String | Solver Type | Preconditioner | Backend | Description |
|---|---|---|---|---|
| `pardiso` | Direct | - | Intel MKL | 🥇 PARDISO direct solver. The fastest for bigger problems, but consumes a lot of memory. |
| `petsc-cg.hypre` | Iterative (CG) | HYPRE | PETSc | 🥇 CG with HYPRE BoomerAMG. The fastest for moderate problems. |
| `scipy.amg-rs` | Iterative (CG) | AMG (Ruge-Stüben) | SciPy/PyAMG | CG with Ruge-Stüben AMG. Only about two times slower than the fastest. |
| `scipy.amg-smooth_aggregation` | Iterative (CG) | AMG (Smoothed Aggregation) | SciPy/PyAMG | CG with Smoothed Aggregation AMG. Memory efficient, but relatively slow. |
| `cholesky` | Direct | - | scikit-sparse | CHOLMOD Cholesky decomposition. Slightly lower memory consumption for huge problems, but slow. |
| `petsc-cg.gamg` | Iterative (CG) | GAMG | PETSc | CG with Geometric Algebraic Multigrid. Not very reliable in performance; 2-3 times slower than the fastest solver. |
| `petsc-mumps` | Direct | - | PETSc/MUMPS | MUMPS direct solver via PETSc. For moderate problems, five times slower than the fastest solver. |
| `petsc-cg.ilu` | Iterative (CG) | ILU | PETSc | CG with incomplete LU factorization. The slowest. |
Representative CPU times for a relatively small problem with $N\times N = 2000\times 2000$ grid points (the relative tolerance for iterative solvers was set to $10^{-8}$):
| Solver | CPU time (s) |
|---|---|
| petsc-cg.hypre | 4.46 |
| pardiso | 8.53 |
| scipy.amg-rs | 8.96 |
| petsc-cg.gamg | 11.96 |
| scipy.amg-smooth_aggregation | 15.48 |
| cholesky | 20.61 |
| petsc-mumps | 26.14 |
| petsc-cg.ilu | 134.98 |
Rules of thumb:
- For fastest computation: use `pardiso` (consumes a lot of memory) or `petsc-cg.hypre` (the only difficulty is installing PETSc).
- For best memory efficiency: use `scipy.amg-rs`.
- For small-scale problems ($N<2000$): use `pardiso`.
- For large-scale problems ($N>2000$): use `petsc-cg.hypre`.
- Avoid `petsc-cg.ilu`.
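A minimal illustration of these rules, reusing solve_fluid_problem from the example above (the helper pick_solver is hypothetical, not part of the package):

```python
def pick_solver(n: int) -> str:
    # Hypothetical helper following the rules of thumb above.
    return "pardiso" if n < 2000 else "petsc-cg.hypre"

_, _, flux = FS.solve_fluid_problem(gaps, solver=pick_solver(gaps.shape[0]))
```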
The most reliable solvers for big problems are `petsc-cg.hypre` and `pardiso`. Below are test data obtained on rough "contact" problems on an Intel(R) Xeon(R) Platinum 8488C. Only the solver's time is shown (the relative tolerance for `petsc-cg.hypre` was set to $10^{-8}$).
| N | PETSc-CG.Hypre CPU time (s) | Intel MKL PARDISO CPU time (s) |
|---|---|---|
| 20 000 | 1059.22 | ∅ |
| 10 000 | 278.18 | 112.38 |
| 5 000 | 70.62 | 28.42 |
| 2 500 | 17.72 | 6.34 |
| 1 250 | 4.47 | 1.93 |
∅: `pardiso` could not run, as it required more than 256 GB of memory.
CPU/RAM Performance
The performance of the code on a truncated rough surface is shown below: the peak memory consumption and the CPU time required for the connectivity analysis, the matrix construction, and the solution of the linear system are provided. The actual number of DOFs is reported, which corresponds to approximately 84% of the square grid $N\times N$ for $N\in\{500,\ 1000,\ 2000,\ 4000,\ 6000,\ 8000,\ 16\,000\}$ (the relative tolerance for iterative solvers was set to $10^{-8}$).
Illustration
An example of a fluid-flow simulation on a grid of $N\times N = 8000 \times 8000$, featuring a truncated self-affine rough surface with a rich spectrum. The solution time on my laptop with PETSc is only 97 seconds, and the peak memory consumption is 25.8 GB.
Another example, for a grid of $N\times N = 20\,000 \times 20\,000$: the sequential simulation time is $\approx 17$ minutes on an Intel(R) Xeon(R) Platinum 8488C, with peak memory below 230 GB using the `petsc-cg.gamg` solver.
Info
- Author: Vladislav A. Yastrebov (CNRS, Mines Paris - PSL)
- AI usage: Cursor & Copilot (different models), ChatGPT 4o, 5, Claude Sonnet 3.7, 4, 4.5
- License: BSD 3-clause
- Date: Sept-Nov 2025