
Kernel attention implementation of the PyTorch TransformerEncoderLayer

Project description

pykernsformer


The pykernsformer module extends the torch.nn.TransformerEncoderLayer class to include custom attention formulas.

Installation

You can install the pykernsformer package using pip:

pip install pykernsformer

Usage

pykernsformer comes with the following built-in attention kernels:

| pykernsformer | Attention | Formula | Citation |
| --- | --- | --- | --- |
| `attention` | Regular | $softmax(\frac{QK^T}{\sqrt{d_k}})V$ | Vaswani et al. |
| `attention_linear` | Linear | $\frac{QK^T}{\sum_k QK^T}V$ | |
| `attention_periodic` | Periodic | $softmax(-\frac{2\sin^2(\pi\frac{\sqrt{2 - 2q_ik_j^T}}{p})}{\sqrt{d_k}})V$ | |
| `attention_LP` | Locally Periodic | $softmax(-\frac{2\sin^2(\pi\frac{\sqrt{2 - 2\hat{q}_i\hat{k}_j^T}}{p})}{\sqrt{d_k}} + \frac{{q_i}{k_j^T}}{\sqrt{d_k}})V$ | |
| `attention_RQ` | Rational Quadratic | $\frac{\left( 1 + \frac{1}{\alpha \sqrt{d_k}} - \frac{2QK^T}{2 \alpha \sqrt{d_k}} \right)^{-\alpha}}{\sum_k \left( 1 + \frac{1}{\alpha \sqrt{d_k}} - \frac{2QK^T}{2 \alpha \sqrt{d_k}} \right)^{-\alpha}}V$ | |
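
For intuition, the linear kernel above replaces the softmax with a plain row normalization of the raw $QK^T$ scores. The following is a minimal PyTorch sketch of that formula (an illustration of the math only, not the package's internal code; the function name is hypothetical, and it assumes the scores are non-negative so that the row sums are valid normalizers):

    import torch

    def linear_attention_sketch(query, key, value):
        # Raw similarity scores: QK^T
        scores = torch.matmul(query, key.transpose(-2, -1))
        # Normalize each row by its sum over the keys instead of a softmax
        p_attn = scores / scores.sum(dim=-1, keepdim=True)
        return torch.matmul(p_attn, value), p_attn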

You can also implement your own attention function with the following signature:

def attention_custom(query, key, value, mask=None, dropout=None):

    [...]  # compute raw attention scores from query and key

    p_attn = [...]  # the attention matrix

    [...]  # e.g. apply dropout to p_attn if provided

    return torch.matmul(p_attn, value), p_attn
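
As a concrete example, here is a custom kernel that reproduces the regular scaled dot-product attention from the table above. This is a sketch that follows the signature shown; the placement of `mask` and `dropout` mirrors the standard Vaswani et al. reference implementation, which is an assumption about how pykernsformer invokes these arguments:

    import math
    import torch
    import torch.nn.functional as F

    def attention_scaled_dot(query, key, value, mask=None, dropout=None):
        d_k = query.size(-1)
        # Scaled similarity scores: QK^T / sqrt(d_k)
        scores = torch.matmul(query, key.transpose(-2, -1)) / math.sqrt(d_k)
        if mask is not None:
            # Exclude masked positions before normalization
            scores = scores.masked_fill(mask == 0, float('-inf'))
        p_attn = F.softmax(scores, dim=-1)  # the attention matrix
        if dropout is not None:
            p_attn = dropout(p_attn)
        # Return the attended values together with the attention matrix,
        # matching the two-value signature above
        return torch.matmul(p_attn, value), p_attn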

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pykernsformer-0.0.4.tar.gz (2.7 kB)

Built Distribution

pykernsformer-0.0.4-py3-none-any.whl (2.8 kB)

File details

Details for the file pykernsformer-0.0.4.tar.gz.

File metadata

  • Download URL: pykernsformer-0.0.4.tar.gz
  • Upload date:
  • Size: 2.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.10.1 pkginfo/1.7.0 requests/2.23.0 requests-toolbelt/0.9.1 tqdm/4.41.1 CPython/3.7.10

File hashes

Hashes for pykernsformer-0.0.4.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 156dc0f62a652adf97d593d6457e3cd3301ab5ee09b6a046057766e23d8b64a7 |
| MD5 | bd8bcad64e4f118f600ab37169c9a4c3 |
| BLAKE2b-256 | 226f749f7b77f7b073f84d0a1d14db3a5e5340e7acc9076748c0ad01ad50f74f |


File details

Details for the file pykernsformer-0.0.4-py3-none-any.whl.

File metadata

  • Download URL: pykernsformer-0.0.4-py3-none-any.whl
  • Upload date:
  • Size: 2.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.10.1 pkginfo/1.7.0 requests/2.23.0 requests-toolbelt/0.9.1 tqdm/4.41.1 CPython/3.7.10

File hashes

Hashes for pykernsformer-0.0.4-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 3609a2dc2ea652219983ffee48bd298a0b1eaa8e9d037327cfacea76b4b94bce |
| MD5 | 4dd7d761b4f7286cee4f999787eb9f45 |
| BLAKE2b-256 | 4c1dc054a9bc3f7d56100e049c6c27ad4ebd053ab19f49cb4327bb5c5f383b8e |

