
Graph Neural Network library for continuous convolutions using separable basis functions in PyTorch.


Symmetric Fourier Basis Convolutions for Learning Lagrangian Fluid Simulations

Authors: Rene Winchenbach, Nils Thuerey

Accepted at: International Conference on Learning Representations (ICLR) 2024 - Vienna (as a poster)

Repository: https://github.com/tum-pbs/SFBC
arXiv paper: https://arxiv.org/abs/2403.16680

Supported Basis Functions:

As basis functions, our code base supports:

  • Radial Basis Functions ('gaussian', 'multiquadric', 'inverse_quadric', 'inverse_multiquadric', 'polyharmonic', 'bump')
  • Interpolation Schemes ('linear' [this is the approach used by the CConv paper by Ummenhofer et al.], 'square' [nearest-neighbor interpolation])
  • B-Spline Schemes ('cubic_spline' [this is the SplineConv basis from Fey et al.], 'quartic_spline', 'quintic_spline')
  • SPH Kernels ('wendland2', 'wendland4', 'wendland6', 'poly6', 'spiky')

All of these can either be used as they are ('rbf x'), normalized to be partitions of unity by widening their shape parameter ('abf x'), or normalized by dividing by the shape function ('ubf x'). For example, to replicate the results of Fey et al., using 'abf cubic_spline' is necessary.

We also offer:

  • Fourier Terms ('ffourier' [our primary approach], 'fourier' [which drops the first antisymmetric term]), each with the suffixes ' even' and ' odd' to restrict the basis to even or odd symmetric terms, respectively
  • Chebyshev Terms ('chebyshev', 'chebyshev2')
  • An antisymmetric enforcement term ('dmcf' [this is the approach by Prantl et al.])
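
For illustration, these basis names are passed as plain strings, optionally with the prefixes and suffixes described above. The combinations below are examples only and should be checked against the repository:

# Example basis-function identifiers (illustrative, not exhaustive):
exampleBases = [
    'linear',             # interpolation basis as in Ummenhofer et al.
    'abf cubic_spline',   # partition-of-unity normalization, replicates Fey et al.
    'ubf wendland2',      # SPH kernel normalized by its shape function
    'ffourier',           # our primary Fourier basis
    'ffourier even',      # even-symmetric Fourier terms only
    'dmcf',               # antisymmetric enforcement as in Prantl et al.
]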

Network Setup:

The code provides two primary classes, BasisConv and BasisNetwork. The former is an individual basis convolution layer; the latter is the network setup used for our publication.

Convolution Layer

The BasisConv class has the following arguments (src/BasisConvolution/convLayerv2.py):

BasisConv(inputFeatures: int, outputFeatures: int, dim: int = 2,
        basisTerms = [4, 4], basisFunction = 'linear', basisPeriodicity = False,
        linearLayerActive = False, linearLayerHiddenLayout = [32, 32], linearLayerActivation = 'relu',
        biasActive = False, feedThrough = False,
        preActivation = None, postActivation = None, cutlassBatchSize = 16,
        cutlassNormalization = False, initializer = 'uniform', optimizeWeights = False, exponentialDecay = False
    )
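
As a minimal construction sketch (the import path follows the source file noted above and should be verified against the repository):

from BasisConvolution.convLayerv2 import BasisConv  # assumed import path

# A single 2D continuous convolution layer mapping 8 to 16 per-vertex
# features, using a 4x4 Fourier basis (arguments per the signature above).
conv = BasisConv(inputFeatures=8, outputFeatures=16, dim=2,
                 basisTerms=[4, 4], basisFunction='ffourier')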

Convolution Network

The BasisNetwork class has the following arguments (src/BasisConvolution/convNetv2.py):

BasisNetwork(fluidFeatures, boundaryFeatures = 0, layers = [32,64,64,2], 
    denseLayer = True, 
    activation = 'relu', coordinateMapping = 'cartesian', 
    dims = [8],  rbfs = ['linear', 'linear'], windowFn = None, 
    batchSize = 32, ignoreCenter = True, 
    normalized = False, outputScaling = 1/128, 
    layerMLP = False, MLPLayout = [32,32], 
    convBias = False, outputBias = True, 
    initializer = 'uniform', optimizeWeights = False, exponentialDecay = True, 
    inputEncoder = None, outputDecoder = None, edgeMLP = None, vertexMLP = None):
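
A construction sketch under the same caveat, mirroring the defaults above; 'ffourier' selects our Fourier basis for both convolution blocks:

from BasisConvolution.convNetv2 import BasisNetwork  # assumed import path

# Fluid-only network (no boundary features) with the default layer layout.
model = BasisNetwork(fluidFeatures=4, boundaryFeatures=0,
                     layers=[32, 64, 64, 2], coordinateMapping='cartesian',
                     dims=[8], rbfs=['ffourier', 'ffourier'],
                     windowFn=None, ignoreCenter=True)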

For more information, see the respective source files.

Inference

The primary forward function of the network can be called as

model(fluidFeatures, fi, fj, fluidEdgeLengths, 
        boundaryFeatures, bf, bb, boundaryEdgeLengths)

For this call, fluidFeatures are per-vertex features for the primary point cloud and boundaryFeatures are per-vertex features for the secondary point cloud. [fi, fj] is the adjacency matrix of the primary point cloud in COO format, and [bf, bb] is the adjacency matrix from the secondary (bb) to the primary (bf) point cloud. fluidEdgeLengths and boundaryEdgeLengths are the relative distances between nodes normalized by the node support radius (i.e., in the range $[-1,1]^d$). The boundary information can be None if not used. For convenience, we also provide a runInference function (util/network) that takes the simulation state, config, and model as arguments.
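
For instance, a fluid-only forward pass (no boundary point cloud) might look like this sketch, with illustrative tensor names:

# fi, fj: COO edge indices of the primary point cloud;
# fluidEdgeLengths: relative edge distances normalized to [-1, 1]^d.
prediction = model(fluidFeatures, fi, fj, fluidEdgeLengths,
                   None, None, None, None)  # boundary information unused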

Training example

As an example for training, see notebooks/exampleTraining. This notebook contains a simple training script that learns the SPH density kernel function for any of the four included datasets in a small ablation study. Here is an example result of training for test case II with five different basis functions (Fourier, Fourier even terms only, Fourier odd terms only, linear, and Chebyshev) and three different basis term counts (2, 4, 8):

[Figure: ablation results for test case II]

You can also find an example of this ablation study on Google Colab.
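
A minimal sketch of a single optimization step in the spirit of that notebook, assuming a fluid-only batch with a precomputed target density (all names illustrative):

import torch

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def trainStep(fluidFeatures, fi, fj, fluidEdgeLengths, targetDensity):
    # One regression step towards the SPH density, as in the notebook.
    optimizer.zero_grad()
    prediction = model(fluidFeatures, fi, fj, fluidEdgeLengths,
                       None, None, None, None)
    loss = torch.nn.functional.mse_loss(prediction, targetDensity)
    loss.backward()
    optimizer.step()
    return loss.item()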

Datasets:

This paper included four datasets in its evaluations. You can find a tool to visualize the datasets under notebooks/datasetVisualizer. Summary information:

Test Case   Scenario          Size        Link
I           compressible 1D   7.9 GByte   https://huggingface.co/datasets/Wi-Re/SFBC_dataset_I
II          WCSPH 2D          45 GByte    https://huggingface.co/datasets/Wi-Re/SFBC_dataset_II
III         IISPH 2D          2.1 GByte   https://huggingface.co/datasets/Wi-Re/SFBC_dataset_III
IV          3D Toy            1.2 GByte   https://huggingface.co/datasets/Wi-Re/SFBC_dataset_IV

Test Case I:

This test case was a pseudo-compressible 1D SPH simulation with random initial conditions. The dataset comprises 36 files with 2048 timesteps and 2048 particles each. Example:

[Figure: example simulation from test case I]

You can find the dataset here (size approximately 7.9 GByte); it is also available as a submodule in this repository under datasets/SFBC_dataset_I.

Test Case II:

This test case was a weakly-compressible 2D SPH simulation with random initial conditions, enclosed by a rigid boundary. The dataset comprises 36 simulations for training and 16 for testing, each with 4096 timesteps and 4096 particles. Example:

[Figure: example simulation from test case II]

You can find the dataset here (size approximately 45 GByte); it is also available as a submodule in this repository under datasets/SFBC_dataset_II.

Test Case III:

This test case was an incompressible 2D SPH simulation in which two randomly sized blobs of liquid collide in free space. The dataset comprises 64 simulations for training and 4 for testing, each with 128 timesteps and 4096 particles. Example:

[Figure: example simulation from test case III]

You can find the dataset here (size approximately 2.1 GByte); it is also available as a submodule in this repository under datasets/SFBC_dataset_III.

Test Case IV:

The last test case is a toy problem to evaluate SPH kernel learning in a 3D setting. For this setup we sampled 4096 particles in a $[-1,1]^3$ domain with random (including negative) masses and additional jitter on the particle positions. The test set contains setups with no jitter (1024 seeds), low jitter (1024 seeds), medium jitter (1 seed), and high jitter (1 seed). Example (low jitter):

[Figure: example sampling from test case IV (low jitter)]

You can find the dataset here (size approximately 1.2 GByte); it is also available as a submodule in this repository under datasets/SFBC_dataset_IV.
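
As a rough illustration of the sampling described above (not the actual generation script; the jitter magnitude is a placeholder):

import torch

n = 16                                   # 16^3 = 4096 particles
axis = torch.linspace(-1, 1, n)
grid = torch.meshgrid(axis, axis, axis, indexing='ij')
positions = torch.stack(grid, dim=-1).reshape(-1, 3)
positions = positions + 0.05 * torch.randn_like(positions)  # "low jitter" (illustrative)
masses = torch.randn(positions.shape[0])  # random masses, including negative ones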


This work was supported by the DFG Individual Research Grant TH 2034/1-2.
