LagrangeBench: A Lagrangian Fluid Mechanics Benchmarking Suite
Installation
Standalone library
Install the core `lagrangebench` library from PyPI with

pip install lagrangebench --extra-index-url=https://download.pytorch.org/whl/cpu
Note that by default `lagrangebench` is installed without JAX GPU support. For that, follow the instructions in the GPU support section.
Clone
Clone this GitHub repository
git clone https://github.com/tumaer/lagrangebench.git
cd lagrangebench
Install the dependencies with Poetry (>=1.6.0)
poetry install --only main
Alternatively, a requirements file is provided. It directly installs the CUDA version of JAX.
pip install -r requirements_cuda.txt
For a CPU version of the requirements file, use `docs/requirements.txt`.
GPU support
To run JAX on GPU, follow the JAX CUDA guide, or in general run
pip install --upgrade jax[cuda11_pip]==0.4.20 -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html
# or, for cuda 12
pip install --upgrade jax[cuda12_pip]==0.4.20 -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html
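After installation, you can confirm which backend JAX picked up. This is a minimal sanity check, guarded so it also runs in an environment where JAX is not installed:

```python
# Report the backend JAX will use (guarded in case JAX is absent).
try:
    import jax
    backend = jax.default_backend()  # "cpu", "gpu", or "tpu"
except ImportError:
    backend = "jax not installed"

print(f"JAX backend: {backend}")
```

If the GPU wheel installed correctly, this should report `gpu` rather than `cpu`.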
MacOS
Currently, only the CPU installation works. You will need to change a few small things to get it going:
- Clone installation: in `pyproject.toml` change the torch version from `2.1.0+cpu` to `2.1.0`. Then remove the `poetry.lock` file and run `poetry install --only main`.
- Configs: set `f64: False` and `num_workers: 0` in the `configs/` files.
Although the current `jax-metal==0.0.5` library supports JAX in general, `jax-md` relies on a padding feature that appears to be missing; see this issue.
Usage
Standalone benchmark library
A general tutorial is provided in the example notebook "Training GNS on the 2D Taylor Green Vortex" under `./notebooks/tutorial.ipynb` in the LagrangeBench repository. The notebook covers the basics of LagrangeBench, such as loading a dataset, setting up a case, training a model from scratch, and evaluating its performance.
Running in a local clone (`main.py`)
Alternatively, experiments can be set up with `main.py`, based on extensive YAML config files and CLI arguments (check `configs/`). By default, the arguments take priority as: 1) passed CLI arguments, 2) YAML config, and 3) `defaults.py` (`lagrangebench` defaults).
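The precedence rule can be sketched as a layered dictionary merge. This is an illustrative helper with hypothetical keys, not the actual `main.py` logic:

```python
# Sketch of the argument precedence: defaults < YAML config < CLI arguments.
defaults = {"lr": 1e-3, "batch_size": 1, "mode": "train"}  # package defaults
yaml_config = {"lr": 5e-4, "batch_size": 2}                # e.g. a file in configs/
cli_args = {"mode": "infer"}                               # passed on the command line

def resolve(defaults: dict, yaml_config: dict, cli_args: dict) -> dict:
    """Merge with increasing priority: later layers override earlier ones."""
    merged = dict(defaults)
    merged.update(yaml_config)
    merged.update(cli_args)
    return merged

config = resolve(defaults, yaml_config, cli_args)
print(config)  # {'lr': 0.0005, 'batch_size': 2, 'mode': 'infer'}
```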
When loading a saved model with `--model_dir`, the config from the checkpoint is loaded automatically and training is restarted. For more details, check the `experiments/` directory and the `run.py` file.
Train
For example, to start a GNS run from scratch on the RPF 2D dataset, use

python main.py --config configs/rpf_2d/gns.yaml

Some model presets can be found in `./configs/`.
If `--mode=all`, then training (`--mode=train`) and subsequent inference (`--mode=infer`) on the test split will run in one go.
Restart training
To restart training from the last checkpoint in `--model_dir`, use
python main.py --model_dir ckp/gns_rpf2d_yyyymmdd-hhmmss
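The `yyyymmdd-hhmmss` suffix in the checkpoint path is a timestamp. A run directory name of this shape can be produced with `strftime`; this is an illustrative helper, not LagrangeBench's own naming code:

```python
# Build a run directory name like ckp/gns_rpf2d_yyyymmdd-hhmmss
# (hypothetical helper; the actual naming happens inside LagrangeBench).
import re
from datetime import datetime

def run_dir(model: str, dataset: str, now: datetime) -> str:
    return f"ckp/{model}_{dataset}_{now.strftime('%Y%m%d-%H%M%S')}"

name = run_dir("gns", "rpf2d", datetime(2023, 10, 23, 14, 5, 9))
print(name)  # ckp/gns_rpf2d_20231023-140509
assert re.fullmatch(r"ckp/gns_rpf2d_\d{8}-\d{6}", name)
```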
Inference
To evaluate a trained model from `--model_dir` on the test split (`--test`), use
python main.py --model_dir ckp/gns_rpf2d_yyyymmdd-hhmmss/best --rollout_dir rollout/gns_rpf2d_yyyymmdd-hhmmss/best --mode infer --test
If the default `--out_type_infer=pkl` is active, the generated trajectories and a `metricsYYYY_MM_DD_HH_MM_SS.pkl` file will be written to the `--rollout_dir`. The metrics file contains all `--metrics_infer` properties for each generated rollout.
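A metrics `.pkl` file can be inspected with the standard `pickle` module. The nested layout below (rollout id, then metric name, then values) is an assumption for illustration, so check the actual file's structure; the example writes and reads a stand-in file:

```python
# Inspect a metrics .pkl written after inference.
# NOTE: the dict structure here is assumed for illustration only.
import os
import pickle
import tempfile
from statistics import mean

# Stand-in for a real metrics file: per-rollout MSE values.
metrics = {
    "rollout_0": {"mse": [1e-4, 2e-4, 3e-4]},
    "rollout_1": {"mse": [2e-4, 4e-4, 6e-4]},
}
path = os.path.join(tempfile.mkdtemp(), "metrics_example.pkl")
with open(path, "wb") as f:
    pickle.dump(metrics, f)

with open(path, "rb") as f:
    loaded = pickle.load(f)

# Average each rollout's MSE, then average over rollouts.
avg_mse = mean(mean(r["mse"]) for r in loaded.values())
print(f"average MSE over rollouts: {avg_mse:.1e}")
```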
Datasets
The datasets are hosted on Zenodo under the DOI 10.5281/zenodo.10021925. When a new dataset instance is created, the data is downloaded automatically. Alternatively, to download the datasets manually, use the `download_data.sh` shell script with either a specific dataset name or "all". Namely:
- Taylor Green Vortex 2D:
bash download_data.sh tgv_2d datasets/
- Reverse Poiseuille Flow 2D:
bash download_data.sh rpf_2d datasets/
- Lid Driven Cavity 2D:
bash download_data.sh ldc_2d datasets/
- Dam break 2D:
bash download_data.sh dam_2d datasets/
- Taylor Green Vortex 3D:
bash download_data.sh tgv_3d datasets/
- Reverse Poiseuille Flow 3D:
bash download_data.sh rpf_3d datasets/
- Lid Driven Cavity 3D:
bash download_data.sh ldc_3d datasets/
- All:
bash download_data.sh all datasets/
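The download commands listed above can also be assembled programmatically, e.g. to script bulk downloads. This is a hypothetical convenience wrapper around `download_data.sh`, not part of LagrangeBench:

```python
# Build the shell commands for the datasets listed above
# (illustrative helper; it only constructs the command strings).
DATASETS = ["tgv_2d", "rpf_2d", "ldc_2d", "dam_2d", "tgv_3d", "rpf_3d", "ldc_3d"]

def download_command(name: str, target: str = "datasets/") -> str:
    if name != "all" and name not in DATASETS:
        raise ValueError(f"unknown dataset: {name}")
    return f"bash download_data.sh {name} {target}"

print(download_command("tgv_2d"))  # bash download_data.sh tgv_2d datasets/
print(download_command("all"))     # bash download_data.sh all datasets/
```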
Notebooks
We provide three notebooks that showcase LagrangeBench functionalities, namely:

- `tutorial.ipynb`, with a general overview of the LagrangeBench library, including training and evaluation of a simple GNS model,
- `datasets.ipynb`, with more details and visualizations of the datasets, and
- `gns_data.ipynb`, showing how to train models within LagrangeBench on the datasets from the paper Learning to Simulate Complex Physics with Graph Networks.
Directory structure
📦lagrangebench
┣ 📂case_setup # Case setup manager
┃ ┣ 📜case.py # CaseSetupFn class
┃ ┣ 📜features.py # Feature extraction
┃ ┗ 📜partition.py # Alternative neighbor list implementations
┣ 📂data # Datasets and dataloading utils
┃ ┣ 📜data.py # H5Dataset class and specific datasets
┃ ┗ 📜utils.py
┣ 📂evaluate # Evaluation and rollout generation tools
┃ ┣ 📜metrics.py
┃ ┗ 📜rollout.py
┣ 📂models # Baseline models
┃ ┣ 📜base.py # BaseModel class
┃ ┣ 📜egnn.py
┃ ┣ 📜gns.py
┃ ┣ 📜linear.py
┃ ┣ 📜painn.py
┃ ┣ 📜segnn.py
┃ ┗ 📜utils.py
┣ 📂train # Trainer method and training tricks
┃ ┣ 📜strats.py # Training tricks
┃ ┗ 📜trainer.py # Trainer method
┣ 📜defaults.py # Default values
┗ 📜utils.py
Contributing
Welcome! We highly appreciate GitHub issues and PRs. You can also chat with us on Discord.
Contributing Guideline
If you want to contribute to this repository, you will need the dev dependencies, i.e. install the environment with `poetry install` without the `--only main` flag. We also recommend installing the pre-commit hooks if you don't want to run `pre-commit run` manually before each commit. To sum up:
git clone https://github.com/tumaer/lagrangebench.git
cd lagrangebench
poetry install
source $PATH_TO_LAGRANGEBENCH_VENV/bin/activate
# install pre-commit hooks defined in .pre-commit-config.yaml
# ruff is configured in pyproject.toml
pre-commit install
After you run `git add <FILE>` and try to `git commit`, the pre-commit hook will fix the linting and formatting of `<FILE>` before you are allowed to commit.
You should also run the unit tests locally before creating a PR. Do this simply by:
# pytest is configured in pyproject.toml
pytest
Citation
The paper (at NeurIPS 2023 Datasets and Benchmarks) can be cited as:
@inproceedings{toshev2023lagrangebench,
title = {LagrangeBench: A Lagrangian Fluid Mechanics Benchmarking Suite},
author = {Artur P. Toshev and Gianluca Galletti and Fabian Fritz and Stefan Adami and Nikolaus A. Adams},
year = {2023},
url = {https://arxiv.org/abs/2309.16342},
booktitle = {37th Conference on Neural Information Processing Systems (NeurIPS 2023) Track on Datasets and Benchmarks},
}
The associated datasets can be cited as:
@dataset{toshev_2023_10021926,
author = {Toshev, Artur P. and Adams, Nikolaus A.},
title = {LagrangeBench Datasets},
month = oct,
year = 2023,
publisher = {Zenodo},
version = {0.0.1},
url = {https://zenodo.org/doi/10.5281/zenodo.10021925},
doi = {10.5281/zenodo.10021925},
}
Publications
The following further publications are based on the LagrangeBench codebase:
- Learning Lagrangian Fluid Mechanics with E(3)-Equivariant Graph Neural Networks (GSI 2023), A. P. Toshev, G. Galletti, J. Brandstetter, S. Adami, N. A. Adams