PyMPDATA + numba-mpi coupler sandbox
PyMPDATA-MPI
PyMPDATA-MPI constitutes a PyMPDATA + numba-mpi coupler enabling numerical solutions of transport equations with the MPDATA numerical scheme in a hybrid parallelisation model featuring both multi-threading and MPI distributed-memory communication. PyMPDATA-MPI adapts to the API of PyMPDATA, offering domain decomposition logic.
Hello world examples
In a minimal setup, PyMPDATA-MPI can be used to solve the following transport equation: $$\partial_t (G \psi) + \nabla \cdot (Gu \psi)= 0$$ in an environment with multiple nodes. Every node (process) is responsible for computing its part of the decomposed domain.
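Below is a minimal per-rank sketch of such a setup (illustrative only, not one of the packaged examples): it relies on PyMPDATA's documented API plus numba-mpi for querying the rank and communicator size, and uses plain Periodic boundary conditions as a stand-in for the MPI-aware classes shipped by PyMPDATA-MPI (whose import paths and constructor arguments are not shown); the grid size, Courant numbers and decomposition are assumed values.

```python
# minimal per-rank sketch (illustrative assumptions, not the packaged examples):
# each MPI rank advects its slice of a domain decomposed along the outer dimension
import numpy as np
import numba_mpi as mpi
from PyMPDATA import Options, ScalarField, VectorField, Stepper, Solver
from PyMPDATA.boundary_conditions import Periodic  # stand-in for the MPI-aware classes

options = Options(n_iters=2)
nx, ny = 24, 24                  # full-domain grid (assumed size)
local_nx = nx // mpi.size()      # naive decomposition along the outer dimension
halo = options.n_halo
bcs = (Periodic(), Periodic())   # PyMPDATA-MPI would supply MPI-aware counterparts

psi = np.zeros((local_nx, ny))
if mpi.rank() == 0:
    psi[local_nx // 2, ny // 2] = 1.0   # initial condition: a single spike

advectee = ScalarField(data=psi, halo=halo, boundary_conditions=bcs)
advector = VectorField(
    data=(
        np.full((local_nx + 1, ny), 0.5),   # Courant number, outer dimension
        np.full((local_nx, ny + 1), 0.25),  # Courant number, inner dimension
    ),
    halo=halo,
    boundary_conditions=bcs,
)
solver = Solver(
    stepper=Stepper(options=options, grid=(local_nx, ny)),
    advectee=advectee,
    advector=advector,
)
solver.advance(n_steps=100)
```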
Spherical scenario (2D)
In spherical geometry, the $G$ factor represents the Jacobian of coordinate transformation.
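For instance (an illustrative case, not necessarily the exact formulation used in the scenario code), on a longitude-latitude grid $(\lambda, \varphi)$ over the unit sphere the area element is $\cos\varphi \, \mathrm{d}\lambda \, \mathrm{d}\varphi$, hence $G = \cos\varphi$.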
In this example (based on a test case from Williamson & Rasch 1989), domain decomposition is done by cutting the sphere along meridians. The inner dimension uses the MPIPolar boundary condition class, while the outer dimension uses MPIPeriodic.
Note that the spherical animations below depict simulations without MPDATA corrective iterations, i.e. only the plain first-order upwind scheme is used (FIX ME).
(animations: 1 worker and 2 workers)
Cartesian scenario (2D)
In the Cartesian example below (based on a test case from Arabas et al. 2014), a constant advector field $u$ is used (and $G=1$). MPI (Message Passing Interface) is used for handling data transfers and synchronisation, with the domain decomposition across MPI workers done in either the inner or the outer dimension (a user setting). Multi-threading (using, e.g., OpenMP via Numba) is used for shared-memory parallelisation within subdomains, with a further subdomain split across the inner dimension (PyMPDATA logic). In this example, two corrective MPDATA iterations are employed.
(animations: 1 worker, 2 workers and 3 workers)
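As a rough sketch of how these settings map onto PyMPDATA options (assuming here that "two corrective iterations" corresponds to n_iters=3 in PyMPDATA's convention, where the first pass is the upwind one; the inner-vs-outer MPI decomposition switch is a PyMPDATA-MPI setting and is not shown):

```python
# assumed mapping of the settings described above onto PyMPDATA options;
# the inner/outer MPI decomposition choice is handled by PyMPDATA-MPI, not shown here
from numba import get_num_threads
from PyMPDATA import Options, Stepper

options = Options(n_iters=3)        # upwind pass + two corrective MPDATA passes
stepper = Stepper(
    options=options,
    grid=(32, 32),                  # per-rank (local) grid size, assumed
    n_threads=get_num_threads(),    # shared-memory threads within the subdomain
)
```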
Package architecture
flowchart BT
H5PY ---> HDF{{HDF5}}
subgraph pythonic-dependencies [Python]
TESTS --> H[pytest-mpi]
subgraph PyMPDATA-MPI ["PyMPDATA-MPI"]
TESTS["PyMPDATA-MPI[tests]"] --> CASES(simulation scenarios)
A1["PyMPDATA-MPI[examples]"] --> CASES
CASES --> D[PyMPDATA-MPI]
end
A1 ---> C[py-modelrunner]
CASES ---> H5PY[h5py]
D --> E[numba-mpi]
H --> X[pytest]
E --> N
F --> N[Numba]
D --> F[PyMPDATA]
end
H ---> MPI
C ---> slurm{{slurm}}
N --> OMPI{{OpenMP}}
N --> L{{LLVM}}
E ---> MPI{{MPI}}
HDF --> MPI
slurm --> MPI
style D fill:#7ae7ff,stroke-width:2px,color:#2B2B2B
click H "https://pypi.org/p/pytest-mpi"
click X "https://pypi.org/p/pytest"
click F "https://pypi.org/p/PyMPDATA"
click N "https://pypi.org/p/numba"
click C "https://pypi.org/p/py-modelrunner"
click H5PY "https://pypi.org/p/h5py"
click E "https://pypi.org/p/numba-mpi"
click A1 "https://pypi.org/p/PyMPDATA-MPI"
click D "https://pypi.org/p/PyMPDATA-MPI"
click TESTS "https://pypi.org/p/PyMPDATA-MPI"
Rectangular boxes indicate pip-installable Python packages (click to go to the pypi.org package site).
Credits:
Development of PyMPDATA-MPI has been supported by Poland's National Science Centre
(grant no. 2020/39/D/ST10/01220).
copyright: Jagiellonian University & AGH University of Krakow
licence: GPL v3
Design goals
- MPI support for PyMPDATA implemented externally (i.e., not incurring any overhead or additional dependencies for PyMPDATA users)
- MPI calls within Numba njitted code (hence not using mpi4py, but leveraging numba-mpi; see the sketch after this list)
- hybrid domain-decomposition parallelisation: threading (internal in PyMPDATA, in the inner dimension) + MPI (either inner or outer dimension)
- portability across major OSes (currently Linux & macOS; no Windows support due to challenges in getting HDF5/MPI-IO to work there)
- full test coverage including CI builds asserting that multi-node and single-node computations give the same results
- Continuous Integration with different OSes and different MPI implementations
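To illustrate the "MPI calls within Numba njitted code" goal, the sketch below (illustrative only, not PyMPDATA-MPI internals) exchanges a buffer between two ranks using numba-mpi routines called from nopython-compiled code:

```python
# sketch of MPI calls issued from within Numba-njitted code via numba-mpi
# (illustration only, not PyMPDATA-MPI internals); run with: mpiexec -n 2 python this_file.py
import numpy as np
import numba
import numba_mpi as mpi


@numba.njit()
def exchange(buf, peer, rank):
    """swaps `buf` with the peer rank from inside nopython-compiled code"""
    if rank == 0:
        mpi.send(buf, peer, 0)
        mpi.recv(buf, peer, 0)
    else:
        incoming = np.empty_like(buf)
        mpi.recv(incoming, peer, 0)
        mpi.send(buf, peer, 0)
        buf[:] = incoming


if mpi.size() == 2:
    data = np.full(3, float(mpi.rank()))
    exchange(data, 1 - mpi.rank(), mpi.rank())
    assert (data == 1.0 - mpi.rank()).all()
```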
Related resources
open-source Large-Eddy-Simulation and related software
Julia
C++
C/CUDA
FORTRAN
- https://github.com/dalesteam/dales
- https://github.com/uclales/uclales
- https://github.com/UCLALES-SALSA/UCLALES-SALSA
- https://github.com/igfuw/bE_SDs
- https://github.com/pencil-code/pencil-code
- https://github.com/AtmosFOAM/AtmosFOAM
- https://github.com/scale-met/scale
Python (incl. Cython)