PyTorch implementation of VQ-VAE
Project description
PyTorch VQ-VAE implementation
Example
from vqvae import VQVAE, sequential_encoder, sequential_decoder
from torch.optim import Adam
from functools import partial
input_channels = 3
output_channels = 3
embedding_length = 256
hidden_channels = 64
beta = 0.25 # commitment loss weight (beta from the VQ-VAE paper)
embedding_size = 512
opt = partial(Adam, lr=2e-4) # optimizer factory passed to the VQVAE module
encoder = sequential_encoder(input_channels, embedding_size, hidden_channels) # Encoder from the paper
decoder = sequential_decoder(embedding_size, output_channels, hidden_channels) # Decoder from the paper
# VQVAE is a PyTorch Lightning module, hence usable to train the model with a Lightning Trainer
vqvae = VQVAE(encoder, decoder, beta, embedding_length, embedding_size, opt)
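Since VQVAE is a PyTorch Lightning module, it can be handed to a standard Lightning Trainer. The sketch below is illustrative only: the random-tensor dataset, batch size, and epoch count are placeholders, and it assumes the module's training step accepts plain image batches.

import torch
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-in data: 256 random 3-channel 32x32 images; swap in a real image dataset.
images = torch.rand(256, 3, 32, 32)
loader = DataLoader(TensorDataset(images), batch_size=32, shuffle=True)

# Standard Lightning training loop; max_epochs is arbitrary here.
trainer = pl.Trainer(max_epochs=10)
trainer.fit(vqvae, loader)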
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
vqvae-1.0.0.tar.gz (4.1 kB)
Built Distribution
vqvae-1.0.0-py3-none-any.whl (4.7 kB)
File details
Details for the file vqvae-1.0.0.tar.gz.
File metadata
- Download URL: vqvae-1.0.0.tar.gz
- Upload date:
- Size: 4.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.9.15
File hashes
Algorithm | Hash digest
---|---
SHA256 | e954391d9b0b288a02baef8e6f12909a523d52eabee1141ebc3d5461f741bc77
MD5 | 7c86028c4b4c1328b7da287a43a3ea1b
BLAKE2b-256 | 8a27c4acf2e2d0ba74889c121901a7878041c110c51d80e2d5787e8607846216
File details
Details for the file vqvae-1.0.0-py3-none-any.whl.
File metadata
- Download URL: vqvae-1.0.0-py3-none-any.whl
- Upload date:
- Size: 4.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.9.15
File hashes
Algorithm | Hash digest
---|---
SHA256 | cae9518a446ebce232db306298f92f1437551e9f2493ed3446927c28320b2e6c
MD5 | b8a9269b876b3dc5e4b2f4ed756f5b47
BLAKE2b-256 | 1c5b511db8adc921adeff48d1440c762c4851f466d116a9a8d5aeafdae86ca62
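A downloaded file can be checked against the digests listed above before installing. The snippet below is a generic sketch using Python's hashlib, not part of the package; it assumes the sdist was saved to the current directory under its original name.

import hashlib

# SHA256 digest of vqvae-1.0.0.tar.gz from the table above
EXPECTED_SHA256 = "e954391d9b0b288a02baef8e6f12909a523d52eabee1141ebc3d5461f741bc77"

def sha256_of(path):
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Raises AssertionError if the download does not match the published digest.
assert sha256_of("vqvae-1.0.0.tar.gz") == EXPECTED_SHA256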