monotonic-attention

Monotonic attention as a probabilistic graphical model. See the write-up explaining how this works, and check out the examples/ directory for more information.
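As a rough sketch of the idea (this follows the standard monotonic attention formulation of Raffel et al., 2017; the exact model in this package may differ in its details): the alignment between queries and keys is treated as a latent index that can only move forward. At query step i the model scans the keys left to right from wherever step i-1 stopped, halting at key j with probability p_{i,j}. Marginalizing over this scan gives the expected (soft) attention weights

\alpha_{i,j} = p_{i,j} \sum_{k=1}^{j} \alpha_{i-1,k} \prod_{l=k}^{j-1} \left(1 - p_{i,l}\right),

which can be computed with a forward recursion over the keys, much like the forward algorithm in an HMM.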
Getting Started
Install from PyPI:
pip install monotonic-attention
Install from source:
pip install git+https://github.com/codekansas/monotonic-attention.git
You should also install Triton if you plan to use the GPU kernels (highly recommended):
pip install triton
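After installing, a quick way to confirm which release you have (0.0.4 at the time of writing), using only the Python standard library:

from importlib.metadata import version

# Prints the installed release of the package, e.g. "0.0.4".
print(version("monotonic-attention"))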
Usage
from monotonic_attention import OneToManyMultiheadMonotonicAttention

# Many keys mapped to a single query.
attn = OneToManyMultiheadMonotonicAttention(
    mode="many_keys_one_query",
    embed_dim=1024,
    num_heads=16,
)
output = attn(query, key, value)

# Many queries mapped to a single key.
attn = OneToManyMultiheadMonotonicAttention(
    mode="many_queries_one_key",
    embed_dim=1024,
    num_heads=16,
)
output = attn(query, key, value)
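For a self-contained example, here is a minimal sketch. It assumes the module follows the usual batch-first (batch, seq_len, embed_dim) input convention of standard multihead attention; the tensor shapes are illustrative, not taken from the package docs:

import torch

from monotonic_attention import OneToManyMultiheadMonotonicAttention

attn = OneToManyMultiheadMonotonicAttention(
    mode="many_keys_one_query",
    embed_dim=1024,
    num_heads=16,
)

# Assumed shapes: (batch, seq_len, embed_dim). Here a short query
# sequence attends over a longer key/value sequence.
query = torch.randn(2, 8, 1024)
key = torch.randn(2, 32, 1024)
value = torch.randn(2, 32, 1024)

output = attn(query, key, value)
print(output.shape)  # If the module mirrors torch.nn.MultiheadAttention, this matches the query: (2, 8, 1024).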
Download files
Source Distribution: monotonic-attention-0.0.4.tar.gz (14.6 kB)

Built Distribution: monotonic_attention-0.0.4-py3-none-any.whl
Hashes for monotonic-attention-0.0.4.tar.gz

Algorithm | Hash digest
---|---
SHA256 | 9801b205c8dc5a5cab197f971e58c9ea96bf738d1f442f51203753cdf077b915
MD5 | 4955805e5a21b77c11db5a4d7bf30e63
BLAKE2b-256 | 11247f8648db9fbedf36a021e58a9214531d7f5e7edca52b3e42452dde23b259
Hashes for monotonic_attention-0.0.4-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 4ff44c3e976c2cd8d3c6a25277a2652905b65c364c779bc9b4792849a4ce86fb
MD5 | 1662fbf3ee3e92489941fd183b167ae9
BLAKE2b-256 | 7d736b520a9e413aca23551b0b0d683177a2fbe8db23ec8585b3e46898662ab6