PyTorch Skillful Nowcasting GAN Implementation


Skillful Nowcasting with Deep Generative Model of Radar (DGMR)

Implementation of DeepMind's Skillful Nowcasting GAN Deep Generative Model of Radar (DGMR) (https://arxiv.org/abs/2104.00954) in PyTorch Lightning.

This implementation follows the pseudocode released by DeepMind as closely as possible. Each of the components (Sampler, Context conditioning stack, Latent conditioning stack, Discriminator, and Generator) is a normal PyTorch module. Because model training is somewhat involved, the overall architecture is wrapped in PyTorch Lightning.

The default parameters match what is written in the paper.
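At a high level, the components compose as follows. This is a structural sketch in plain Python (stand-in classes, not the real torch modules): the generator is built from the two conditioning stacks plus the sampler, and the training wrapper pairs it with the discriminator.

```python
# Structural sketch only — stand-in classes showing how the pieces
# compose, not the actual dgmr modules.
class Generator:
    def __init__(self, conditioning_stack, latent_stack, sampler):
        self.conditioning_stack = conditioning_stack
        self.latent_stack = latent_stack
        self.sampler = sampler

class DGMR:
    """Training wrapper pairing the generator with the discriminator."""
    def __init__(self, generator, discriminator):
        self.generator = generator
        self.discriminator = discriminator

gen = Generator("context stack", "latent stack", "sampler")
model = DGMR(gen, "discriminator")
print(model.generator.sampler)  # sampler
```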

Installation

Clone the repository, then run

pip install -r requirements.txt
pip install -e .

Alternatively, install from PyPI with pip install dgmr

Training Data

The open-sourced UK training dataset has been mirrored to HuggingFace Datasets! This should enable training the original architecture on the original data and reproducing the results from the paper. The full dataset is roughly 1TB in size, and unfortunately streaming it from HF Datasets does not currently work, so it has to be cached locally. A sample dataset is also available, and it can be streamed directly from GCP at no cost.

The dataset can be loaded with

from datasets import load_dataset

dataset = load_dataset("openclimatefix/nimrod-uk-1km")

For now, only the sample dataset supports streaming, as its data files are hosted on GCP rather than on HF. It can be loaded with:

from datasets import load_dataset

dataset = load_dataset("openclimatefix/nimrod-uk-1km", "sample", streaming=True)

The authors also used MRMS US precipitation radar data as another comparison. While that dataset was not released, the MRMS data is publicly available, and we have made it available on HuggingFace Datasets as well. This dataset consists of the raw 3500x7000 contiguous-US MRMS data from 2016 through May 2022, is a few hundred GB in size, and will be sporadically updated with more recent data. It is stored in Zarr format and can be streamed without local caching through

from datasets import load_dataset

dataset = load_dataset("openclimatefix/mrms", "default_sequence", streaming=True)

This streams the data with 24 timesteps per example, just like the UK DGMR dataset. To get individual MRMS frames instead of a sequence, use:

from datasets import load_dataset

dataset = load_dataset("openclimatefix/mrms", "default", streaming=True)
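Conceptually, the "default_sequence" configuration groups consecutive frames into non-overlapping 24-timestep sequences, while "default" yields one frame per example. A pure-Python sketch of that grouping (the helper name `to_sequences` is ours, not part of the datasets library):

```python
# Illustrative only: group a flat stream of frames into non-overlapping
# sequences of seq_len timesteps, as the sequence config does conceptually.
def to_sequences(frames, seq_len=24):
    return [
        frames[i:i + seq_len]
        for i in range(0, len(frames) - seq_len + 1, seq_len)
    ]

# 48 frame indices -> two sequences of 24 timesteps each
sequences = to_sequences(list(range(48)))
print(len(sequences), len(sequences[0]))  # 2 24
```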

Pretrained Weights

Pretrained weights are available through HuggingFace Hub; the current weights were trained on the sample dataset. The whole DGMR model, or its individual components, can be loaded as follows:

from dgmr import DGMR, Sampler, Generator, Discriminator, LatentConditioningStack, ContextConditioningStack
model = DGMR.from_pretrained("openclimatefix/dgmr")
sampler = Sampler.from_pretrained("openclimatefix/dgmr-sampler")
discriminator = Discriminator.from_pretrained("openclimatefix/dgmr-discriminator")
latent_stack = LatentConditioningStack.from_pretrained("openclimatefix/dgmr-latent-conditioning-stack")
context_stack = ContextConditioningStack.from_pretrained("openclimatefix/dgmr-context-conditioning-stack")
generator = Generator(conditioning_stack=context_stack, latent_stack=latent_stack, sampler=sampler)

Example Usage

from dgmr import DGMR
import torch
import torch.nn.functional as F

model = DGMR(
    forecast_steps=4,
    input_channels=1,
    output_shape=128,
    latent_channels=384,
    context_channels=192,
    num_samples=3,
)
x = torch.rand((2, 4, 1, 128, 128))  # (batch, time, channels, height, width)
out = model(x)
y = torch.rand((2, 4, 1, 128, 128))
loss = F.mse_loss(y, out)
loss.backward()
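As a sanity check on the shapes above: the model maps a context tensor of shape (batch, input_frames, channels, height, width) to a forecast with the same batch, channel, and spatial dimensions but forecast_steps timesteps. A torch-free sketch of that bookkeeping (`expected_output_shape` is an illustrative helper of ours, not part of dgmr):

```python
# Shape bookkeeping only (no torch): the forecast output keeps the batch,
# channel, and spatial dims of the input but has `forecast_steps` timesteps.
def expected_output_shape(input_shape, forecast_steps):
    batch, _input_frames, channels, height, width = input_shape
    return (batch, forecast_steps, channels, height, width)

print(expected_output_shape((2, 4, 1, 128, 128), forecast_steps=4))
# (2, 4, 1, 128, 128) — matching the shape of y in the example above
```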

Citation

@article{ravuri2021skillful,
  author={Suman Ravuri and Karel Lenc and Matthew Willson and Dmitry Kangin and Remi Lam and Piotr Mirowski and Megan Fitzsimons and Maria Athanassiadou and Sheleem Kashem and Sam Madge and Rachel Prudden and Amol Mandhane and Aidan Clark and Andrew Brock and Karen Simonyan and Raia Hadsell and Niall Robinson and Ellen Clancy and Alberto Arribas and Shakir Mohamed},
  title={Skillful Precipitation Nowcasting using Deep Generative Models of Radar},
  journal={Nature},
  volume={597},
  pages={672--677},
  year={2021}
}

Contributors ✨

Thanks goes to these wonderful people (emoji key):

Jacob Bieker 💻
Johan Mathe 💻
Z1YUE 🐛
Nan.Y 💬
Taisanai 💬
cameron 💬
zhrli 💬
Najeeb Kazmi 💬
TQRTQ 💬
Viktor Bordiuzha 💡
agijsberts 💻
Mews ⚠️
Aleksei Rutkovskii 💻

This project follows the all-contributors specification. Contributions of any kind welcome!

Download files


Source Distribution

dgmr-1.4.0.tar.gz (25.0 kB)

Uploaded Source

Built Distribution

dgmr-1.4.0-py3-none-any.whl (23.7 kB)

Uploaded Python 3

File details

Details for the file dgmr-1.4.0.tar.gz.

File metadata

  • Download URL: dgmr-1.4.0.tar.gz
  • Upload date:
  • Size: 25.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.20

File hashes

Hashes for dgmr-1.4.0.tar.gz:

  • SHA256: a099ceaef989d1ef49a9b0f2881f01891d8ee94f04fb939e960a235cb350d2a3
  • MD5: 555e4a891cdd9d8104b85d911f5bde5f
  • BLAKE2b-256: b2e3bdb45e568073af2a38c90aac12ddc390d241cb5ac5bb25d51bbf5abc0bcc


File details

Details for the file dgmr-1.4.0-py3-none-any.whl.

File metadata

  • Download URL: dgmr-1.4.0-py3-none-any.whl
  • Upload date:
  • Size: 23.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.20

File hashes

Hashes for dgmr-1.4.0-py3-none-any.whl:

  • SHA256: c84a78e81419cd3ec6dc2447b4ac737e51035c7210c3831524166bed0edbed23
  • MD5: 07bb9c306e8d7bc31f490a80bbfeb120
  • BLAKE2b-256: f9f56dc73593a924165ffd86e2560d1aea4003f650107b4f20b2538d6dc6d55a

