
Learning Partial Differential Equations


INSTALLATION

By way of pip:

pip install lpde

By way of source

git clone https://github.com/fkemeth/lpde
cd lpde
pip install .

USAGE

This Python package provides functions for learning partial differential equations (PDEs) from data.

To learn a PDE from a set of training data that can then be used for prediction, several hyperparameters have to be specified. These can be defined in a config.cfg file, which is read using a config parser. The following hyperparameters can be specified:

  • Under the subsection SYSTEM, parameters used for generating the data are defined. In the example considered here, these include

    • n_time_steps: The number of time steps at which training data is collected.
    • n_grid_points: The number of spatial grid points at which training data is collected.
    • length: The length of the one-dimensional spatial interval. Together with n_grid_points, this defines the spatial resolution delta_x that is also used to calculate the partial derivatives.
    • tmin: The time after which training and test data are collected.
    • tmax: The time up to which training and test data are collected. Together with tmin and n_time_steps, this is used to calculate the temporal resolution delta_t.
    • use_fd_dt: Whether the time derivative of the variables at each point in space and time is calculated using finite differences.
    • fd_dt_acc: The accuracy order of the finite differences for computing the time derivative of the variables.
  • Under the subsection MODEL, parameters specifying the neural network PDE are defined. These parameters are used to create an object of the Network class, and include

    • kernel_size: The width of the finite difference stencil used to calculate input spatial derivatives.
    • device: Whether to use 'cpu' or 'cuda'.
    • use_param: Boolean that specifies whether to use additional parameters as input to the PDE. This is required if one wants to do bifurcation analysis of the learned PDE model.
    • num_params: If use_param is True, the number of additional system parameters has to be specified here.
    • n_filters: The number of neurons in each layer of the PDE model.
    • n_layers: The number of layers of the PDE model.
    • n_derivs: The number of derivatives used in the PDE model.
  • Under the subsection TRAINING, hyperparameters used for training the neural network PDE are specified. These are used to create an object of the Model class, a wrapper around the Network object, and include

    • batch_size: Batch size used for training.
    • lr: Initial learning rate used for training.
    • weight_decay: Strength of the L2 regularization applied to the neural network weights.
    • epochs: Number of epochs/training iterations.
    • reduce_factor: Factor by which the reduce-learning-rate-on-plateau scheduler reduces the learning rate.
    • patience: Number of epochs the reduce-learning-rate-on-plateau scheduler waits without improvement before reducing the learning rate.
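Put together, a config.cfg covering these three sections might look like the sketch below and can be read with Python's standard configparser. The section and option names follow the list above, but the concrete values are illustrative and not taken from the package:

```python
from configparser import ConfigParser

# Illustrative config.cfg content; option names follow the list above,
# the numerical values are made up for demonstration.
CONFIG_TEXT = """
[SYSTEM]
n_time_steps = 200
n_grid_points = 128
length = 100.0
tmin = 0.0
tmax = 50.0
use_fd_dt = True
fd_dt_acc = 2

[MODEL]
kernel_size = 5
device = cpu
use_param = False
num_params = 0
n_filters = 32
n_layers = 4
n_derivs = 2

[TRAINING]
batch_size = 32
lr = 1e-3
weight_decay = 0.0
epochs = 100
reduce_factor = 0.5
patience = 10
"""

config = ConfigParser()
config.read_string(CONFIG_TEXT)

# The spatial and temporal resolutions described above follow directly
# from the SYSTEM parameters.
delta_x = config.getfloat("SYSTEM", "length") / config.getint("SYSTEM", "n_grid_points")
delta_t = (config.getfloat("SYSTEM", "tmax") - config.getfloat("SYSTEM", "tmin")) \
    / config.getint("SYSTEM", "n_time_steps")
print(delta_x, delta_t)  # 0.78125 0.25
```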

EXAMPLE

The usage is best illustrated with an example. Here, we use simulation data of an actual PDE, the complex Ginzburg-Landau equation with spatio-temporally chaotic dynamics, solved numerically on a one-dimensional periodic domain.
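For reference, in one common parameterization (with real parameters c1 and c2; the coefficients used in the example script may differ), the complex Ginzburg-Landau equation for a complex field W(x, t) reads

```latex
\partial_t W = W + (1 + i c_1)\, \partial_x^2 W - (1 - i c_2)\, |W|^2 W .
```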

This example can be run by using

cd example/cgle/
python run.py

The training data then looks like the data shown in the figure below.

Training data

Using the hyperparameters defined in config.cfg, a neural network PDE is learned on the data shown in the figure above, by optimizing its weights using backpropagation and the PyTorch framework.
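As a rough illustration of the kind of input such a network receives (a generic NumPy sketch, not the package's actual implementation), spatial derivatives on a periodic domain can be computed by applying a fixed finite-difference stencil of width kernel_size:

```python
import numpy as np

# Second-order central-difference stencils for the first and second
# spatial derivative (stencil width 3), applied on a periodic domain.
n_grid_points = 128
length = 2 * np.pi
delta_x = length / n_grid_points
x = np.arange(n_grid_points) * delta_x
u = np.sin(x)

stencil_d1 = np.array([-0.5, 0.0, 0.5]) / delta_x
stencil_d2 = np.array([1.0, -2.0, 1.0]) / delta_x**2

def periodic_conv(u, stencil):
    """Apply a finite-difference stencil with periodic (wrap-around) boundaries."""
    half = len(stencil) // 2
    padded = np.concatenate([u[-half:], u, u[:half]])
    # Reversing the stencil turns np.convolve into a correlation,
    # so output[i] = sum_k stencil[k] * u[i - half + k].
    return np.convolve(padded, stencil[::-1], mode="valid")

du_dx = periodic_conv(u, stencil_d1)    # approximately cos(x)
d2u_dx2 = periodic_conv(u, stencil_d2)  # approximately -sin(x)
```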

The trained neural network PDE can then be used to make predictions on test data. This is shown in the figure below, where the actual test data is shown on the left and the predicted data on the right, obtained by integrating an initial snapshot at t=0 forward in time using the learned PDE model.
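Schematically, this prediction step amounts to integrating the learned right-hand side forward in time from an initial snapshot. The sketch below uses a hypothetical rhs function as a stand-in for the trained network (here a simple linear decay, so the result is known) and explicit Euler stepping for brevity:

```python
import numpy as np

def rhs(u):
    """Stand-in for the learned du/dt; here simple linear decay du/dt = -u."""
    return -u

def integrate(u0, dt, n_steps):
    """Explicit Euler time stepping from an initial snapshot u0."""
    u = u0.copy()
    trajectory = [u.copy()]
    for _ in range(n_steps):
        u = u + dt * rhs(u)
        trajectory.append(u.copy())
    return np.array(trajectory)

u0 = np.ones(4)
traj = integrate(u0, dt=0.001, n_steps=1000)
# After t = 1, the trajectory approximates exp(-1) at every grid point.
```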

Test data and predictions

See this GitHub repository for further examples.

ISSUES

For questions, please contact felix@kemeth.de or visit the GitHub repository.

LICENCE

This work is licensed under the MIT License. Please cite

"Learning emergent partial differential equations in a learned emergent space", F. P. Kemeth et al., Nature Communications 13, 3318 (2022), https://www.nature.com/articles/s41467-022-30628-6

if you use this package for publications.
