PyTorch implementation of Glow

Project description

Glow: Generative Flow with Invertible 1x1 Convolutions [Work in Progress]

Unofficial PyTorch implementation of "Glow: Generative Flow with Invertible 1x1 Convolutions"

The original paper can be found here.

The code is based on another implementation found here.

This repository contains the Glow model code and associated training / sampling scripts.

This repository is a work in progress. Default parameters may not be optimal!

Usage

Glow Training

Run Glow training using the config file cfg.toml. If no config is given, this defaults to config/cifar10.toml:

python main.py --cfg-path cfg.toml --no-amp

We currently recommend NOT using automatic mixed precision (AMP).

Other useful flags:

--nb-samples            # number of samples to generate when evaluating [16]
--resume                # resume training from specified checkpoint 
--seed                  # set RNG seed 
--no-save               # disable saving of checkpoints [False]
--no-cuda               # disable the use of CUDA device [False]
--no-amp                # disable the use of automatic mixed precision [False]
--nb-workers            # set number of dataloader workers. [4]
--no-grad-checkpoint    # don't checkpoint gradients [False]
--temperature           # set temperature when sampling at evaluation [0.7]
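Combining these, a typical training invocation might look like the following. The config path, seed, and flag values here are illustrative, not recommended defaults:

```shell
# Illustrative example: train on CIFAR-10 with a fixed seed,
# 16 evaluation samples, and AMP disabled (as recommended above).
python main.py --cfg-path config/cifar10.toml \
    --seed 42 \
    --nb-samples 16 \
    --temperature 0.7 \
    --no-amp
```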

Glow Sampling

Run Glow sampling using the config file cfg.toml from the checkpoint checkpoint.pt using sample mode mode:

python main.py --sample --sample-mode mode --resume checkpoint.pt --cfg-path cfg.toml --no-amp

Other flags from training will also work during sampling.

The sampling modes are:

  • normal: samples a random latent and displays the corresponding samples, saving them to sample.jpg
  • vtemp: samples a random latent and varies the temperature, dumping samples to samples-vtemp/
  • interpolate: computes the latents of dataset items, then linearly interpolates between them, dumping samples to samples-interpolate/
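The latent manipulations behind these modes can be sketched in a few lines. This is an illustrative NumPy sketch, not this package's API; the function names, shapes, and step count are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_latent(shape, temperature=0.7, rng=rng):
    """'normal' / 'vtemp' modes: draw z ~ N(0, T^2 I).
    Lower temperature T trades diversity for typically cleaner samples."""
    return temperature * rng.standard_normal(shape)

def interpolate_latents(z1, z2, steps=8):
    """'interpolate' mode: linear blend between two latents,
    including both endpoints."""
    alphas = np.linspace(0.0, 1.0, steps)
    return [(1.0 - a) * z1 + a * z2 for a in alphas]

# Hypothetical latent shape for a 32x32 RGB model.
z_a = sample_latent((3, 32, 32))
z_b = sample_latent((3, 32, 32))
path = interpolate_latents(z_a, z_b, steps=8)
print(len(path))  # 8 interpolated latents, endpoints equal to z_a and z_b
```

In the real model each interpolated latent would then be passed through the inverse flow to produce an image.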

Samples

TODO: add (nice) sample outputs

Checkpoints

TODO: add pretrained checkpoints

TODO:

  • Glow Model
  • Training script
  • Sampling script
  • Gradient checkpoints
  • PyPi library
  • Add pretrained models / nice samples

Citations

Glow: Generative Flow with Invertible 1x1 Convolutions

Diederik P. Kingma, Prafulla Dhariwal

@misc{kingma2018glow,
      title={Glow: Generative Flow with Invertible 1x1 Convolutions}, 
      author={Diederik P. Kingma and Prafulla Dhariwal},
      year={2018},
      eprint={1807.03039},
      archivePrefix={arXiv},
      primaryClass={stat.ML}
}


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pytorch-glow-0.0.1.tar.gz (7.7 kB)

Uploaded Source

Built Distribution

pytorch_glow-0.0.1-py3-none-any.whl (7.9 kB)

Uploaded Python 3

File details

Details for the file pytorch-glow-0.0.1.tar.gz.

File metadata

  • Download URL: pytorch-glow-0.0.1.tar.gz
  • Upload date:
  • Size: 7.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.8.1 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.9.7

File hashes

Hashes for pytorch-glow-0.0.1.tar.gz:

  • SHA256: 1ccf0f4e34ece99a327ea92b64d95fc4d474e74d6397edf2e22311289e945330
  • MD5: e7cd4f6de437d5513fd84ee0b95f634a
  • BLAKE2b-256: 0c87827856d97edb481ae132443494e18ec065e812407464a348866f06f222e9

See more details on using hashes here.

File details

Details for the file pytorch_glow-0.0.1-py3-none-any.whl.

File metadata

  • Download URL: pytorch_glow-0.0.1-py3-none-any.whl
  • Upload date:
  • Size: 7.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.8.1 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.9.7

File hashes

Hashes for pytorch_glow-0.0.1-py3-none-any.whl:

  • SHA256: a5a5529d49f16c6c0599b38158e0e66dfa96087c53c13b1066911418b7b2aa19
  • MD5: 1f03e5b8b06c73920c73d19e13a390f1
  • BLAKE2b-256: 75dd340a5d3466e112400c0a90a8f354936e0a7b5eff984c17403d9084ff786d

See more details on using hashes here.
