Differentiable Laplace Reconstructions in PyTorch

PyTorch Implementation of Differentiable Laplace Reconstructions

This library provides Inverse Laplace Transform (ILT) algorithms implemented in PyTorch. Backpropagation through differential equation (DE) solutions in the Laplace domain is supported using the Riemann stereographic projection, which gives a better global representation of the complex Laplace domain. For the use of DE representations in the Laplace domain in deep learning applications, see reference [1].

Installation

To install the latest stable version:

pip install torchlaplace

To install the latest on GitHub:

pip install git+https://github.com/samholt/NeuralLaplace.git

Tutorials

  1. Tutorial: Laplace Reconstruct (run in Colab)
  2. Tutorial: Inverse Laplace Transform Algorithms (run in Colab)

Examples

Examples are placed in the examples directory.

If you are interested in using this library, take a look at examples/simple_demo.py to understand how to use torchlaplace to fit a DE system.

(Animation: Lotka Volterra DDE demo.)

Basic usage

This library provides one main interface, laplace_reconstruct, which uses a selected inverse Laplace transform algorithm to reconstruct trajectories from a provided parameterized Laplace representation functional $\mathbf{F}(\mathbf{p},\mathbf{s})$,

$$\mathbf{x}(t) = \text{inverse laplace transform}(\mathbf{F}(\mathbf{p},\mathbf{s}), t)$$

where $\mathbf{p}$ is a Tensor encoding the initial system state as a latent variable, and $t$ denotes the time points at which to reconstruct the trajectories.
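
For reference, the inverse Laplace transform that the ILT algorithms below approximate numerically is the Bromwich contour integral,

$$\mathbf{x}(t) = \frac{1}{2\pi i} \int_{\sigma - i\infty}^{\sigma + i\infty} \mathbf{F}(\mathbf{p},\mathbf{s}) \, e^{\mathbf{s}t} \, d\mathbf{s},$$

where $\sigma$ is chosen to the right of all singularities of $\mathbf{F}$.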

This can be used as follows:

from torchlaplace import laplace_reconstruct

laplace_reconstruct(laplace_rep_func, p, t)

where laplace_rep_func is any callable implementing the parameterized Laplace representation functional $\mathbf{F}(\mathbf{p},\mathbf{s})$, and p is a Tensor encoding the initial state, of shape $(\text{MiniBatchSize},\text{K})$, where $\text{K}$ is a hyperparameter that can be set by the user. Finally, t is a Tensor of shape $(\text{MiniBatchSize},\text{SeqLen})$ or $(\text{SeqLen})$ containing the time points at which to reconstruct the trajectories.

Note that reconstruction is not numerically stable for all ILT methods; however, it should generally be fine with the default fourier (Fourier series inverse) ILT algorithm.

The parameterized Laplace representation functional laplace_rep_func, $\mathbf{F}(\mathbf{p},\mathbf{s})$ also takes an input complex value $\mathbf{s}$. This $\mathbf{s}$ is used internally when reconstructing a specified time point with the selected inverse Laplace transform algorithm ilt_algorithm.

The biggest gotcha is that laplace_rep_func must be an nn.Module when using the laplace_reconstruct function. This is because the parameters of the parameterized Laplace representation need to be collected internally. A minimal sketch is shown below.
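
As a concrete illustration, here is a minimal sketch adapted from the pattern in examples/simple_demo.py. The module name LaplaceRepFunc, the layer sizes, and the latent and output dimensions are illustrative assumptions; the forward pass follows the sphere-projection convention, taking the concatenated Riemann-sphere coordinates of the queried $\mathbf{s}$ points together with the latent $\mathbf{p}$, and returning predicted sphere coordinates $(\theta, \phi)$ for each output dimension.

import torch
from torch import nn
from torchlaplace import laplace_reconstruct

latent_dim = 2        # K: size of the latent encoding p (illustrative)
recon_dim = 1         # d_obs: dimension of each reconstructed time point (illustrative)
s_recon_terms = 33    # number of complex s points per time point (illustrative)

class LaplaceRepFunc(nn.Module):
    # Hypothetical representation module for F(p, s): maps the Riemann-sphere
    # coordinates of the queried s points, concatenated with the latent p,
    # to predicted sphere coordinates (theta, phi) for each output dimension.
    def __init__(self, s_dim, output_dim, latent_dim, hidden_units=64):
        super().__init__()
        self.s_dim = s_dim
        self.output_dim = output_dim
        self.latent_dim = latent_dim
        self.net = nn.Sequential(
            nn.Linear(s_dim * 2 + latent_dim, hidden_units),
            nn.Tanh(),
            nn.Linear(hidden_units, hidden_units),
            nn.Tanh(),
            nn.Linear(hidden_units, s_dim * 2 * output_dim),
        )

    def forward(self, i):
        out = self.net(i.view(-1, self.s_dim * 2 + self.latent_dim))
        out = out.view(-1, 2 * self.output_dim, self.s_dim)
        theta = torch.tanh(out[:, : self.output_dim, :]) * torch.pi       # angle in (-pi, pi)
        phi = torch.tanh(out[:, self.output_dim :, :]) * torch.pi / 2.0   # angle in (-pi/2, pi/2)
        return theta, phi

laplace_rep_func = LaplaceRepFunc(s_recon_terms, recon_dim, latent_dim)
p = torch.randn(16, latent_dim)        # (MiniBatchSize, K) latent encodings
t = torch.linspace(0.1, 10.0, 100)     # (SeqLen,) time points to reconstruct
x = laplace_reconstruct(
    laplace_rep_func, p, t,
    recon_dim=recon_dim,
    ilt_reconstruction_terms=s_recon_terms,
)
print(x.shape)   # expected: (16, 100, 1)

In practice laplace_rep_func would be trained end to end, for example by minimizing a reconstruction loss between x and observed trajectories, since laplace_reconstruct supports backpropagation through the module's parameters.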

To replicate the experiments in [1], see the experiments directory.

Keyword arguments for laplace_reconstruct

Keyword arguments (an illustrative call follows the list below):

  • recon_dim (int): trajectory dimension for a given time point; corresponds to the dimension $d_{\text{obs}}$. If not explicitly specified, the last dimension of p, i.e. $\text{K}$, is used.
  • ilt_algorithm (str): inverse Laplace transform algorithm to use. Default: fourier. Available: {fourier, dehoog, cme, fixed_tablot, stehfest}. See the API documentation on ILTs for further details.
  • use_sphere_projection (bool): evaluate laplace_rep_func on the stereographic projection of the Riemann sphere. Default: True.
  • ilt_reconstruction_terms (int): number of ILT reconstruction terms, i.e. the number of complex $s$ points passed to laplace_rep_func to reconstruct a single time point.
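
Continuing the sketch above, an illustrative call that sets these keyword arguments explicitly (the values are assumptions, not recommendations):

x = laplace_reconstruct(
    laplace_rep_func, p, t,
    recon_dim=recon_dim,                     # d_obs; defaults to K if omitted
    ilt_algorithm="fourier",                 # one of the ILT algorithms listed below
    use_sphere_projection=True,              # evaluate laplace_rep_func on the Riemann sphere
    ilt_reconstruction_terms=s_recon_terms,  # number of complex s points per time point
)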

List of ILT Algorithms

The following ILT algorithms are implemented:

  • fourier Fourier Series Inverse [default].
  • dehoog De Hoog (accelerated version of the Fourier series ILT); slower inference in comparison.
  • cme Concentrated Matrix Exponentials.
  • fixed_tablot Fixed Talbot.
  • stehfest Gaver-Stehfest.

For most problems the default fourier is a good choice. Other ILT algorithms, such as cme, may be more appropriate when using a larger number of ILT reconstruction terms. Some allow trade-offs between speed and accuracy: for example, dehoog is very accurate when the representation is known or exact, but it is slow and can be unstable when learning the correct representation.

Detailed documentation

For detailed documentation see the official docs.

Frequently Asked Questions

Take a look at our FAQ for frequently asked questions.

References

For the use of DE representations in the Laplace domain, leveraging the stereographic projection, and other applications, see:

[1] Samuel Holt, Zhaozhi Qian, and Mihaela van der Schaar. "Neural Laplace: Learning Diverse Classes of Differential Equations in the Laplace Domain." International Conference on Machine Learning, 2022. [arXiv]


If you found this library useful in your research, please consider citing:

@inproceedings{holt2022neural,
  title={Neural Laplace: Learning diverse classes of differential equations in the Laplace domain},
  author={Holt, Samuel I and Qian, Zhaozhi and van der Schaar, Mihaela},
  booktitle={International Conference on Machine Learning},
  pages={8811--8832},
  year={2022},
  organization={PMLR}
}

