PyMPDATA + numba-mpi coupler sandbox
Project description
PyMPDATA-MPI
PyMPDATA-MPI constitutes a PyMPDATA + numba-mpi coupler enabling numerical solutions of transport equations with the MPDATA numerical scheme in a hybrid parallelisation model, combining multi-threading with MPI distributed-memory communication. PyMPDATA-MPI adheres to the API of PyMPDATA, adding domain decomposition logic.
Hello world examples
In a minimal setup, PyMPDATA-MPI can be used to solve the following transport equation: $$\partial_t (G \psi) + \nabla \cdot (Gu \psi)= 0$$ in an environment with multiple nodes. Every node (process) is responsible for computing its part of the decomposed domain.
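As a point of reference, below is a minimal single-node sketch of such a solve using the upstream PyMPDATA API alone (with $G=1$ and periodic boundaries); the grid size, Courant numbers and zero initial condition are illustrative assumptions, and in the distributed setting PyMPDATA-MPI substitutes MPI-aware boundary conditions, as in the scenarios that follow:

```python
# minimal single-node upwind solve of the transport equation above with G=1;
# PyMPDATA-MPI replaces the boundary conditions with MPI-aware counterparts
import numpy as np
from PyMPDATA import Options, ScalarField, VectorField, Stepper, Solver
from PyMPDATA.boundary_conditions import Periodic

options = Options(n_iters=1)  # n_iters=1: plain first-order upwind
nx, ny = 24, 24  # arbitrary grid size
halo = options.n_halo

advectee = ScalarField(
    data=np.zeros((nx, ny)),  # placeholder initial condition
    halo=halo,
    boundary_conditions=(Periodic(), Periodic()),
)
advector = VectorField(
    data=(np.full((nx + 1, ny), 0.5), np.full((nx, ny + 1), 0.25)),  # Courant numbers
    halo=halo,
    boundary_conditions=(Periodic(), Periodic()),
)
solver = Solver(
    stepper=Stepper(options=options, grid=(nx, ny)),
    advectee=advectee,
    advector=advector,
)
solver.advance(n_steps=10)
```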
Spherical scenario (2D)
In spherical geometry, the $G$ factor represents the Jacobian of the coordinate transformation.
In this example (based on a test case from Williamson & Rasch 1989), domain decomposition is done by cutting the sphere along meridians. The inner dimension uses the MPIPolar boundary condition class, while the outer dimension uses MPIPeriodic (see the sketch below).
Note that the spherical animations below depict simulations without MPDATA corrective iterations, i.e., only the plain first-order upwind scheme is used.
(animations: simulations run with 1 worker and with 2 workers)
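For orientation, the pairing described above looks roughly as follows in single-node PyMPDATA, whose built-in Periodic and Polar classes play the roles that MPIPeriodic and MPIPolar play in the distributed setup; the grid resolution and the Polar constructor arguments are assumptions for illustration only:

```python
# single-node sketch of the boundary-condition pairing; in PyMPDATA-MPI,
# MPIPeriodic (outer dimension) and MPIPolar (inner dimension) stand here
from PyMPDATA.boundary_conditions import Periodic, Polar

GRID = (64, 32)  # (longitude, latitude), arbitrary resolution
boundary_conditions = (
    Periodic(),  # outer dimension: periodic along meridians
    Polar(grid=GRID, longitude_idx=0, latitude_idx=1),  # inner dimension: poles
)
```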
Cartesian scenario (2D)
In the Cartesian example below (based on a test case from Arabas et al. 2014), a constant advector field $u$ is used (and $G=1$). MPI (Message Passing Interface) is used for handling data transfers and synchronisation in the outer dimension, while multi-threading (using, e.g., OpenMP via Numba) is used in the inner dimension. In this example, two corrective MPDATA iterations are employed (see the sketch below).
(animations: simulations run with 1, 2, 3 and 4 workers)
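The iteration count is set through PyMPDATA's Options; a minimal sketch of the setting used here (n_iters counts all passes, so two corrective iterations on top of the upwind pass means n_iters=3):

```python
from PyMPDATA import Options

# n_iters=1 would be plain upwind; n_iters=3 adds two corrective
# MPDATA iterations, matching the Cartesian example above
options = Options(n_iters=3)
assert options.n_iters == 3
```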
Package architecture
flowchart BT
H5PY ---> HDF{{HDF5}}
subgraph pythonic-dependencies [Python]
TESTS --> H[pytest-mpi]
subgraph PyMPDATA-MPI ["PyMPDATA-MPI"]
TESTS["PyMPDATA-MPI[tests]"] --> CASES(simulation scenarios)
A1["PyMPDATA-MPI[examples]"] --> CASES
CASES --> D[PyMPDATA-MPI]
end
A1 ---> C[py-modelrunner]
CASES ---> H5PY[h5py]
D --> E[numba-mpi]
H --> X[pytest]
E --> N
F --> N[Numba]
D --> F[PyMPDATA]
end
H ---> MPI
C ---> slurm{{slurm}}
N --> OMPI{{OpenMP}}
N --> L{{LLVM}}
E ---> MPI{{MPI}}
HDF --> MPI
slurm --> MPI
style D fill:#7ae7ff,stroke-width:2px,color:#2B2B2B
click H "https://pypi.org/p/pytest-mpi"
click X "https://pypi.org/p/pytest"
click F "https://pypi.org/p/PyMPDATA"
click N "https://pypi.org/p/numba"
click C "https://pypi.org/p/py-modelrunner"
click H5PY "https://pypi.org/p/h5py"
click E "https://pypi.org/p/numba-mpi"
click A1 "https://pypi.org/p/PyMPDATA-MPI"
click D "https://pypi.org/p/PyMPDATA-MPI"
click TESTS "https://pypi.org/p/PyMPDATA-MPI"
Rectangular boxes indicate pip-installable Python packages (click a box to open its pypi.org page).
Credits:
Development of PyMPDATA-MPI has been supported by Poland's National Science Centre (grant no. 2020/39/D/ST10/01220).
copyright: Jagiellonian University & AGH University of Krakow
licence: GPL v3
Design goals
- MPI support for PyMPDATA implemented externally (i.e., not incurring any overhead or additional dependencies for PyMPDATA users)
- MPI calls within Numba njitted code (hence not using mpi4py, but leveraging numba-mpi; see the sketch after this list)
- hybrid parallelisation: threading (internal to PyMPDATA, in the inner dimension) + MPI (in the outer dimension)
- portability across major OSes (currently Linux & macOS; no Windows support due to challenges in getting HDF5/MPI-IO to work there)
- full test coverage, including CI builds asserting that multi-node and single-node computations yield the same results
- Continuous Integration on different OSes and with different MPI implementations
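A minimal sketch of the njitted-MPI pattern from the second bullet, assuming only numba-mpi's rank/send/recv helpers (the buffer contents, tag value and function name are illustrative):

```python
# MPI calls issued from inside Numba-njitted code via numba-mpi (no mpi4py);
# run under MPI with, e.g.: mpiexec -n 2 python demo.py
import numba
import numba_mpi as mpi
import numpy as np

@numba.njit()
def exchange(buf):
    # numba-mpi calls compile within njitted code and return status codes
    if mpi.rank() == 0:
        return mpi.send(buf, 1, 44)  # dest=1, tag=44
    if mpi.rank() == 1:
        return mpi.recv(buf, 0, 44)  # source=0, tag=44
    return 0  # other ranks: no-op

buf = np.arange(3.0) if mpi.rank() == 0 else np.empty(3)
status = exchange(buf)
```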
Related resources
open-source Large-Eddy-Simulation and related software
Julia:
C++:
C/CUDA:
FORTRAN:
- https://github.com/dalesteam/dales
- https://github.com/uclales/uclales
- https://github.com/UCLALES-SALSA/UCLALES-SALSA
- https://github.com/igfuw/bE_SDs
- https://github.com/pencil-code/pencil-code
- https://github.com/AtmosFOAM/AtmosFOAM
- https://github.com/scale-met/scale
Python/Cython/C:
Project details
Download files
Hashes for PyMPDATA_MPI-0.0.5-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 9f01af21154af286ec03389782a6de338b7395ac487323044f708817a87729a4
MD5 | 5460204d256c63d14ecc9b865c17252f
BLAKE2b-256 | ae7234ff9037b078c14fb9cbd08cc0da252abef6c5b47046e6d2b6f42278c733