
1D, 2D, and 3D Sinusoidal Positional Encodings in PyTorch

Project description

1D, 2D, and 3D Sinusoidal Positional Encoding in PyTorch

This is an implementation of 1D, 2D, and 3D sinusoidal positional encoding. It can encode tensors of the form (batchsize, x, ch), (batchsize, x, y, ch), and (batchsize, x, y, z, ch), where the positional encodings are added along the ch dimension. The Attention Is All You Need paper introduced positional encoding in one dimension only; this package extends it to 2 and 3 dimensions. A sketch of the 1D case is shown below.
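For reference, the 1D case follows the sinusoidal formulation from Attention Is All You Need. The sketch below is an illustrative, self-contained PyTorch version of that formulation (it is not taken from this package's source) and assumes an even channel count.

    import math
    import torch

    def sinusoidal_encoding_1d(batch_size, length, channels):
        # Illustrative 1D sinusoidal positional encoding (assumes `channels` is even).
        # Produces a (batch_size, length, channels) tensor to be added to an input
        # of the same shape along the ch dimension.
        position = torch.arange(length, dtype=torch.float32).unsqueeze(1)      # (length, 1)
        div_term = torch.exp(
            torch.arange(0, channels, 2, dtype=torch.float32)
            * (-math.log(10000.0) / channels)
        )                                                                       # (channels // 2,)
        pe = torch.zeros(length, channels)
        pe[:, 0::2] = torch.sin(position * div_term)  # even channels: sine
        pe[:, 1::2] = torch.cos(position * div_term)  # odd channels: cosine
        return pe.unsqueeze(0).expand(batch_size, -1, -1)

    x = torch.randn(4, 128, 64)                   # (batchsize, x, ch)
    x = x + sinusoidal_encoding_1d(*x.shape)      # encoding added on the ch dimension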

To install, simply run:

pip install positional-encodings

For more information, check out the GitHub page.
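As a rough usage sketch (the class name PositionalEncoding2D and the call pattern below are assumptions, not a confirmed API; consult the GitHub page for the actual interface):

    import torch
    from positional_encodings import PositionalEncoding2D  # assumed import path and class name

    p_enc_2d = PositionalEncoding2D(128)   # assumed constructor taking the channel count
    x = torch.zeros(1, 32, 32, 128)        # (batchsize, x, y, ch)
    y = x + p_enc_2d(x)                    # encoding added on the ch dimension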

Download files

Source Distribution: positional_encodings-1.0.5.tar.gz (2.5 kB)

Built Distribution: positional_encodings-1.0.5-py3-none-any.whl (3.7 kB)
