
LagrangeBench: A Lagrangian Fluid Mechanics Benchmarking Suite


Installation

Standalone library

Install the core lagrangebench library from PyPI with

pip install lagrangebench --extra-index-url=https://download.pytorch.org/whl/cpu

Note that by default lagrangebench is installed without JAX GPU support. For GPU support, follow the instructions in the GPU support section below.

Clone

Clone this GitHub repository

git clone https://github.com/tumaer/lagrangebench.git
cd lagrangebench

Install the dependencies with Poetry (>=1.6.0)

poetry install --only main

Alternatively, a requirements file is provided. It directly installs the CUDA version of JAX.

pip install -r requirements_cuda.txt

For a CPU version of the requirements file, use the one in docs/requirements.txt.

GPU support

To run JAX on GPU, follow the JAX CUDA installation guide, or in general run

pip install --upgrade jax[cuda11_pip]==0.4.18 -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html
# or, for cuda 12
pip install --upgrade jax[cuda12_pip]==0.4.18 -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html
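
To check whether JAX picked up the GPU after installation, list the visible devices:

import jax

# With a CUDA-enabled jaxlib this should include a GPU entry;
# with the CPU-only install it lists only CPU devices.
print(jax.devices())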

Usage

Standalone benchmark library

A general tutorial is provided in the example notebook "Training GNS on the 2D Taylor Green Vortex" under ./notebooks/tutorial.ipynb in the LagrangeBench repository. The notebook covers the basics of LagrangeBench, such as loading a dataset, setting up a case, training a model from scratch, and evaluating its performance.
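
For orientation, a minimal training sketch along the lines of that notebook could look as follows. The class and argument names (TGV2D, case_builder, GNS, Trainer, step_max, the "bounds" and "dim" metadata keys) are drawn from the tutorial but may differ between versions, so treat this as a sketch and consult the notebook for the exact API.

import numpy as np
import lagrangebench

# Creating the dataset objects downloads the 2D TGV data if it is not already present.
train_data = lagrangebench.TGV2D("train")
valid_data = lagrangebench.TGV2D("valid")

# Case setup (domain box, neighbor search, input features) from the dataset metadata;
# the "bounds" metadata key and the box computation are assumptions.
bounds = np.array(train_data.metadata["bounds"])
case = lagrangebench.case_builder(box=bounds[:, 1] - bounds[:, 0], metadata=train_data.metadata)

# GNS baseline; the hyperparameters below are placeholders, not tuned values.
model = lagrangebench.GNS(
    particle_dimension=train_data.metadata["dim"],
    latent_size=16,
    num_mp_steps=4,
)

# Short training run; see the notebook for evaluation and rollout generation.
trainer = lagrangebench.Trainer(model, case, train_data, valid_data)
trainer.train(step_max=1000)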

Running in a local clone (main.py)

Alternatively, experiments can be set up with main.py, built around extensive YAML config files and CLI arguments (see configs/ and experiments/configs.py). By default, CLI arguments overwrite the corresponding entries of the YAML config. When loading a saved model with --model_dir, the config stored in the checkpoint is loaded automatically and training is restarted.

Train

For example, to start a GNS run from scratch on the RPF 2D dataset use

python main.py --config configs/rpf_2d/gns.yaml

Some model presets can be found in ./configs/.

If --mode=all, then training (--mode=train) and subsequent inference (--mode=infer) on the test split will be run in one go.

Restart training

To restart training from the last checkpoint in --model_dir use

python main.py --model_dir ckp/gns_rpf2d_yyyymmdd-hhmmss

Inference

To evaluate a trained model from --model_dir on the test split (--test) use

python main.py --model_dir ckp/gns_rpf2d_yyyymmdd-hhmmss/best --rollout_dir rollout/gns_rpf2d_yyyymmdd-hhmmss/best --mode infer --test

With the default --out_type_infer=pkl, the generated trajectories and a metricsYYYY_MM_DD_HH_MM_SS.pkl file are written to --rollout_dir. The metrics file contains all --metrics_infer properties for each generated rollout.
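
Since the metrics file is a plain pickle, it can be inspected with the standard library. The path below is a placeholder following the naming pattern above, and the per-rollout dictionary layout is an assumption.

import pickle

# Placeholder path following the metricsYYYY_MM_DD_HH_MM_SS.pkl naming pattern.
metrics_path = "rollout/gns_rpf2d_yyyymmdd-hhmmss/best/metrics2023_10_26_12_00_00.pkl"

with open(metrics_path, "rb") as f:
    metrics = pickle.load(f)

# Assumed layout: one entry per generated rollout, each holding the requested
# --metrics_infer quantities.
for rollout_id, rollout_metrics in metrics.items():
    print(rollout_id, list(rollout_metrics))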

Datasets

The datasets are hosted on Zenodo under the DOI 10.5281/zenodo.10021925. When creating a new dataset instance, the data is downloaded automatically (see the Python sketch after the list below). Alternatively, the datasets can be downloaded manually with the download_data.sh shell script, either with a specific dataset name or with "all":

  • Taylor Green Vortex 2D: bash download_data.sh tgv_2d datasets/
  • Reverse Poiseuille Flow 2D: bash download_data.sh rpf_2d datasets/
  • Lid Driven Cavity 2D: bash download_data.sh ldc_2d datasets/
  • Dam break 2D: bash download_data.sh dam_2d datasets/
  • Taylor Green Vortex 3D: bash download_data.sh tgv_3d datasets/
  • Reverse Poiseuille Flow 3D: bash download_data.sh rpf_3d datasets/
  • Lid Driven Cavity 3D: bash download_data.sh ldc_3d datasets/
  • All: bash download_data.sh all datasets/
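
Equivalently, instantiating one of the dataset classes from Python triggers the download on first use. The RPF2D class name and the dataset_path keyword below are assumptions based on the H5Dataset interface and may differ; check lagrangebench/data/data.py for the exact API.

import lagrangebench

# Assumed: RPF2D wraps H5Dataset and downloads the data into dataset_path on first use.
rpf_2d_train = lagrangebench.RPF2D("train", dataset_path="datasets/rpf_2d")
print(rpf_2d_train.metadata["dim"], len(rpf_2d_train))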

Notebooks

Two further notebooks dedicated to the datasets are provided in the repository.

Directory structure

📦lagrangebench
 ┣ 📂case_setup     # Case setup manager
 ┃ ┣ 📜case.py      # CaseSetupFn class
 ┃ ┣ 📜features.py  # Feature extraction
 ┃ ┗ 📜partition.py # Alternative neighbor list implementations
 ┣ 📂data           # Datasets and dataloading utils
 ┃ ┣ 📜data.py      # H5Dataset class and specific datasets
 ┃ ┗ 📜utils.py
 ┣ 📂evaluate       # Evaluation and rollout generation tools
 ┃ ┣ 📜metrics.py
 ┃ ┗ 📜rollout.py
 ┣ 📂models         # Baseline models
 ┃ ┣ 📜base.py      # BaseModel class
 ┃ ┣ 📜egnn.py
 ┃ ┣ 📜gns.py
 ┃ ┣ 📜linear.py
 ┃ ┣ 📜painn.py
 ┃ ┣ 📜segnn.py
 ┃ ┗ 📜utils.py
 ┣ 📂train          # Trainer method and training tricks
 ┃ ┣ 📜strats.py    # Training tricks
 ┃ ┗ 📜trainer.py   # Trainer method
 ┣ 📜defaults.py    # Default values
 ┗ 📜utils.py

Citation

The paper (published at the NeurIPS 2023 Datasets and Benchmarks Track) can be cited as:

@inproceedings{toshev2023lagrangebench,
    title      = {LagrangeBench: A Lagrangian Fluid Mechanics Benchmarking Suite},
    author     = {Artur P. Toshev and Gianluca Galletti and Fabian Fritz and Stefan Adami and Nikolaus A. Adams},
    year       = {2023},
    url        = {https://arxiv.org/abs/2309.16342},
    booktitle  = {37th Conference on Neural Information Processing Systems (NeurIPS 2023) Track on Datasets and Benchmarks},
}

The associated datasets can be cited as:

@dataset{toshev_2023_10021926,
  author       = {Toshev, Artur P. and Adams, Nikolaus A.},
  title        = {LagrangeBench Datasets},
  month        = oct,
  year         = 2023,
  publisher    = {Zenodo},
  version      = {0.0.1},
  url          = {https://zenodo.org/doi/10.5281/zenodo.10021925},
  doi          = {10.5281/zenodo.10021925},
}

Publications

The following further publications are based on the LagrangeBench codebase:

  1. Learning Lagrangian Fluid Mechanics with E(3)-Equivariant Graph Neural Networks (GSI 2023), A. P. Toshev, G. Galletti, J. Brandstetter, S. Adami, N. A. Adams

