monotonic-attention
Monotonic attention as a probabilistic graphical model.
Write-up explaining how this works
Check out the examples/ directory for more information.
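As background, one standard way to view monotonic attention as a probabilistic graphical model is a Markov chain over alignment positions that may only stay or advance at each query step. This is a sketch of the general idea, not necessarily this package's exact formulation; see the write-up above for the authoritative treatment. Writing p_{t,j} for the probability of advancing past key j at query step t, the alignment marginals \alpha_{t,j} = P(z_t = j) satisfy the forward recursion

\alpha_{t,j} = (1 - p_{t,j}) \, \alpha_{t-1,j} + p_{t,j-1} \, \alpha_{t-1,j-1}

and the attention output at step t is the expectation \sum_j \alpha_{t,j} v_j over the values.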
Getting Started
Install from PyPI:
pip install monotonic-attention
Install from source:
pip install git+https://github.com/codekansas/monotonic-attention.git
You should also install Triton if you plan to use the GPU kernels (highly recommended):
pip install triton
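To confirm that Triton is importable and a CUDA device is visible before relying on the GPU kernels, a quick sanity check (assuming PyTorch is already installed as a dependency):

import torch
import triton

print(triton.__version__)         # Triton is importable.
print(torch.cuda.is_available())  # A CUDA device is visible.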
Usage
import torch

from monotonic_attention import OneToManyMultiheadMonotonicAttention

# Example inputs; the (batch, sequence length, embed_dim) layout is an
# assumption for illustration — check the write-up for the expected shapes.
query = torch.randn(2, 32, 1024)
key = torch.randn(2, 64, 1024)
value = torch.randn(2, 64, 1024)

# Many keys mapped to a single query.
attn = OneToManyMultiheadMonotonicAttention(
    mode="many_keys_one_query",
    embed_dim=1024,
    num_heads=16,
)
output = attn(query, key, value)

# Many queries mapped to a single key.
attn = OneToManyMultiheadMonotonicAttention(
    mode="many_queries_one_key",
    embed_dim=1024,
    num_heads=16,
)
output = attn(query, key, value)
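If Triton is installed, the natural next step is to run the module on a CUDA device. This is a sketch under the assumption that the package dispatches to its Triton kernels automatically for CUDA tensors; check the examples/ directory for the authoritative usage.

# Assumption: the Triton GPU kernels are selected automatically for CUDA tensors.
attn = attn.cuda()
output = attn(query.cuda(), key.cuda(), value.cuda())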