Fast kernel for triangle self attention.
Project description
Fused Triangle Self Attention kernel, written in Triton. Basically flash attention, but for triangle self attention. The implementation is heavily inspired by FlagAttention and the Triton fused attention tutorial.
- O(n^2) memory complexity (vs. O(n^3) for pure PyTorch); see the reference sketch after this list.
- Backward pass is ~2x faster than the next-fastest implementation I could find (the DS4S evoformer kernel).
- Forward pass is ~4x faster than the next-fastest implementation I could find (the DS4S evoformer kernel).
- As far as I can tell, faster than the naive implementation.
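
The memory claim above refers to the unfused reference, which has to materialise the full attention logit tensor. Below is a minimal pure-PyTorch sketch of that reference; the shapes and the placement of the pair bias are assumptions based on the standard triangle self attention formulation, not this package's exact API.

```python
import math
import torch

def naive_triangle_attention(q, k, v, b):
    """Unfused reference for triangle self attention (around the starting node).

    Assumed shapes (for illustration only, not this package's API):
      q, k, v: [h, n, n, d]  per-head projections of the pair representation
      b:       [h, n, n]     pair bias added to the attention logits

    The full [h, n, n, n] logit tensor is materialised, hence the O(n^3)
    memory footprint that a fused kernel avoids.
    """
    d = q.shape[-1]
    # logits[h, i, j, k] = <q[h, i, j], k[h, i, k]> / sqrt(d) + b[h, j, k]
    logits = torch.einsum("hijd,hikd->hijk", q, k) / math.sqrt(d)
    logits = logits + b[:, None, :, :]
    attn = torch.softmax(logits, dim=-1)
    # out[h, i, j] = sum_k attn[h, i, j, k] * v[h, i, k]
    return torch.einsum("hijk,hikd->hijd", attn, v)
```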
Plots
All benchmarks were run on an RTX 3090 in bfloat16.
Forward pass (plot)
Backward pass (plot)
Todos:
- [ ] Try to train a model with it.
- [ ] Can we perform any of dq/db/dkv transposed?
- [ ] Rewrite the autotuner.
Project details
Download files
Download the file for your platform.
Source Distribution
trifast-0.1.0.tar.gz (15.9 MB)
Built Distribution
trifast-0.1.0-py3-none-any.whl (15.8 kB)
File details
Details for the file trifast-0.1.0.tar.gz.
File metadata
- Download URL: trifast-0.1.0.tar.gz
- Upload date:
- Size: 15.9 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.5.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 8f17ea1af9056ade1dc42bec7aaee7dd752994764a1a7321f617caac08a90bce |
| MD5 | 2fd2e3c91dd30beff876d980ff21c1d8 |
| BLAKE2b-256 | 9857588a8107a7527a018bcad3d53e8e1d64bb715bce4ffc25233a1d7692b12f |
File details
Details for the file trifast-0.1.0-py3-none-any.whl.
File metadata
- Download URL: trifast-0.1.0-py3-none-any.whl
- Upload date:
- Size: 15.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.5.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 990d321e2ecfb692c0b944b9db5878434f7e9d6c53c5c29651848b3f0c186dd6 |
| MD5 | 549251a88328df99c1922add26a05f3d |
| BLAKE2b-256 | 6e3d2bc3de4686c833cebd20ae97974614f007ae6d5732a9b7a76209305599a2 |
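
To check a downloaded file against the digests listed above, here is a minimal sketch using Python's standard-library hashlib; the path is an assumption, so point it at whichever file you actually downloaded.

```python
import hashlib

# Path to the downloaded file; adjust to wherever you saved it.
path = "trifast-0.1.0.tar.gz"

sha256 = hashlib.sha256()
with open(path, "rb") as f:
    # Read in chunks so large files don't need to fit in memory.
    for chunk in iter(lambda: f.read(8192), b""):
        sha256.update(chunk)

# Compare this value against the SHA256 digest in the table above.
print(sha256.hexdigest())
```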