Masked Convolution
A PyTorch implementation of a thin wrapper for masked convolutions.
What are masked convolutions?
Similar to partial convolutions, masked convolutions zero out part of the kernel, effectively ignoring the input at specific positions. For example, consider the sequence
a = [1, 2, 3, 4, 5]
and a convolution kernel
kernel = [1, 1, 1]
Convolving the kernel over a (without padding) gives
a_conv = [6, 9, 12]
However, if we mask the kernel with
mask = [1, 0, 1]
then the masked convolution over a yields
a_masked_conv = [4, 6, 8]
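The worked example above can be reproduced with a small NumPy sketch. Note that conv1d_valid is a helper defined here for illustration only, not part of this package:

```python
import numpy as np

a = np.array([1, 2, 3, 4, 5], dtype=float)
kernel = np.array([1, 1, 1], dtype=float)
mask = np.array([1, 0, 1], dtype=float)

def conv1d_valid(x, k):
    # "valid" cross-correlation: slide k over x with no padding
    n = len(x) - len(k) + 1
    return np.array([np.dot(x[i:i + len(k)], k) for i in range(n)])

a_conv = conv1d_valid(a, kernel)                 # [6, 9, 12]
a_masked_conv = conv1d_valid(a, kernel * mask)   # [4, 6, 8]
print(a_conv, a_masked_conv)
```

Masking is just an element-wise multiplication of the kernel by the mask before convolving, so positions where the mask is zero contribute nothing to the output.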
One use of masked convolutions is emulating skip-grams.
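As an illustrative sketch (plain Python, not this package's API): zeroing the interior taps of a wider kernel makes each output combine non-adjacent inputs, the way a skip-gram pairs words across a gap.

```python
tokens = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
# A width-4 kernel whose two middle taps are masked out pairs
# position i with position i + 3, skipping the two entries between them.
kernel = [1.0, 1.0, 1.0, 1.0]
mask = [1, 0, 0, 1]

masked = [k * m for k, m in zip(kernel, mask)]
pairs = [
    sum(t * w for t, w in zip(tokens[i:i + 4], masked))
    for i in range(len(tokens) - 3)
]
print(pairs)  # [1+4, 2+5, 3+6] = [5.0, 7.0, 9.0]
```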