1D, 2D, and 3D Sinusoidal Positional Encodings in PyTorch

Project description

1D, 2D, and 3D Sinusoidal Positional Encoding for PyTorch

This is an implementation of 1D, 2D, and 3D sinusoidal positional encoding. It can encode tensors of the form (batchsize, x, ch), (batchsize, x, y, ch), and (batchsize, x, y, z, ch), where the positional encodings are added along the ch dimension. The "Attention Is All You Need" paper only defined positional encoding for one dimension; this package extends the same scheme to 2 and 3 dimensions.

To install, simply run:

pip install positional-encodings
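
As a quick illustration of the tensor layout described above, here is a minimal, self-contained sketch of a 2D sinusoidal encoding in plain PyTorch. The function name sinusoidal_encoding_2d and its internals are illustrative assumptions, not this package's actual API; see the GitHub page linked below for the real interface.

import torch

def sinusoidal_encoding_2d(x_len, y_len, channels):
    # Returns a (x_len, y_len, channels) tensor of 2D sinusoidal encodings.
    # Half of the channels encode the x position, the other half the y position
    # (channels is assumed to be divisible by 4).
    half = channels // 2
    div_term = torch.exp(
        torch.arange(0, half, 2).float() * (-torch.log(torch.tensor(10000.0)) / half)
    )
    pos_x = torch.arange(x_len).float().unsqueeze(1)  # (x_len, 1)
    pos_y = torch.arange(y_len).float().unsqueeze(1)  # (y_len, 1)

    enc_x = torch.zeros(x_len, half)
    enc_x[:, 0::2] = torch.sin(pos_x * div_term)
    enc_x[:, 1::2] = torch.cos(pos_x * div_term)

    enc_y = torch.zeros(y_len, half)
    enc_y[:, 0::2] = torch.sin(pos_y * div_term)
    enc_y[:, 1::2] = torch.cos(pos_y * div_term)

    enc = torch.zeros(x_len, y_len, channels)
    enc[:, :, :half] = enc_x.unsqueeze(1)  # broadcast the x encoding across y
    enc[:, :, half:] = enc_y.unsqueeze(0)  # broadcast the y encoding across x
    return enc

# Usage: add the encoding to a (batchsize, x, y, ch) tensor.
batch = torch.randn(4, 32, 32, 64)
pe = sinusoidal_encoding_2d(32, 32, 64)
out = batch + pe  # broadcasts over the batch dimension

The 1D and 3D cases follow the same pattern, splitting the ch dimension into one block of sin/cos pairs per spatial axis.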

For more information, check out the GitHub page.

Download files


Files for positional-encodings, version 1.0.5:

positional_encodings-1.0.5-py3-none-any.whl (3.7 kB, Wheel, Python py3)
positional_encodings-1.0.5.tar.gz (2.5 kB, Source)
