Amortized Reparametrization for Continuous Time Autoencoders (ARCTA)

Amortized reparametrization: Efficient and Scalable Variational Inference for Latent SDEs

Accompanying code for the NeurIPS 2023 paper by Kevin Course and Prasanth B. Nair.

Tutorials and documentation coming soon!

1. Installation

Installing the package

The package can be installed from PyPI:

pip install arlatentsde

Reproducing the experiment environment

We ran experiments on a Linux machine with CUDA 11.8. We used poetry to manage dependencies.

If you prefer a different environment manager, all dependencies are listed in the pyproject.toml.

To reproduce the experiment environment, first check out the branch named neurips-freeze. Then install all of the optional dependency groups required to run the experiments:

poetry install --with dev,exps

To download all pretrained models, datasets, and figures, we use repopacker:

repopacker download models-data-figs.zip
repopacker unpack models-data-figs.zip

2. Usage

The numerical studies can be rerun from the experiments directory using the command-line script main.py. All numerical studies follow the same basic structure: (i) generate or download data, (ii) train the model, and (iii) post-process the results for plots and tables.

The script has the following syntax:

python main.py [experiment] [action]

The choices of experiments and actions are provided below:

  • Experiments:
    • predprey: Orders of magnitude fewer NFEs experiment
    • lorenz: Adjoint instabilities experiment
    • mocap: Motion capture benchmark
    • nsde-video: Neural SDE from video experiment
    • grad-variance: Gradient variance experiment
  • Actions:
    • get-data: Download or generate data
    • train: Train models
    • post-process: Post-process results for plots and tables
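
The repository's actual dispatch code is not shown here, but a CLI with the syntax above can be sketched with argparse. The function and variable names below are illustrative assumptions, not the project's real implementation:

```python
import argparse

# Choices taken from the experiment and action lists above.
EXPERIMENTS = ["predprey", "lorenz", "mocap", "nsde-video", "grad-variance"]
ACTIONS = ["get-data", "train", "post-process"]

def build_parser() -> argparse.ArgumentParser:
    """Build a parser matching `python main.py [experiment] [action]`."""
    parser = argparse.ArgumentParser(description="Run ARCTA numerical studies.")
    parser.add_argument("experiment", choices=EXPERIMENTS,
                        help="which numerical study to run")
    parser.add_argument("action", choices=ACTIONS,
                        help="pipeline stage to execute")
    return parser

if __name__ == "__main__":
    args = build_parser().parse_args()
    print(f"Running '{args.action}' for experiment '{args.experiment}'")
```

With this structure, an invalid experiment or action name fails fast with a usage message listing the valid choices, which keeps the three-stage workflow (get-data, then train, then post-process) easy to script.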

3. Reference

Course, K., Nair, P.B. Amortized Reparametrization: Efficient and Scalable Variational Inference for Latent SDEs.
In Proc. Advances in Neural Information Processing Systems, (2023).

@inproceedings{course2023amortized,
  title={Amortized Reparametrization: Efficient and Scalable Variational Inference for Latent {SDE}s},
  author={Kevin Course and Prasanth B. Nair},
  booktitle={Thirty-seventh Conference on Neural Information Processing Systems},
  year={2023},
  url={https://openreview.net/forum?id=5yZiP9fZNv}
}


