Simplicial Attention
Implementation of 2-simplicial attention, proposed by Clift et al. (2019) and recently made practical in Fast and Simplex (Roy et al., 2025)
Paper explanation by Gabriel Mongaras
Appreciation
- Tejas for finding my error in the Triton backwards kernel!
Install
```bash
$ pip install simplicial-attention
```
Usage
```python
import torch
from simplicial_attention.triton_two_simplicial_attention import SlidingWindowTwoSimplicialMHA

higher_order_attn = SlidingWindowTwoSimplicialMHA(
    dim = 512,
    dim_head = 64,
    heads = 8
).cuda()

tokens = torch.randn(2, 1024, 512).cuda()

assert higher_order_attn(tokens).shape == tokens.shape
```
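For intuition, here is a naive NumPy sketch of the trilinear attention that 2-simplicial attention computes: each query scores *pairs* of key positions rather than single positions, which is what makes the naive form O(n³). This is an illustrative reference only, not the package's Triton kernel; the function name, single-head layout, and absence of masking / sliding window are all simplifying assumptions.

```python
import numpy as np

def two_simplicial_attention(q, k1, k2, v1, v2):
    """Naive O(n^3) reference sketch - single head, no masking or sliding window."""
    n, d = q.shape
    # trilinear logits: query i scores every pair of positions (j, k)
    logits = np.einsum('id,jd,kd->ijk', q, k1, k2) / np.sqrt(d)
    # softmax over the flattened (j, k) pair axis
    flat = logits.reshape(n, -1)
    weights = np.exp(flat - flat.max(axis=-1, keepdims=True))
    attn = (weights / weights.sum(axis=-1, keepdims=True)).reshape(n, n, n)
    # the two values of a pair are combined elementwise, then aggregated
    return np.einsum('ijk,jd,kd->id', attn, v1, v2)

rng = np.random.default_rng(0)
q, k1, k2, v1, v2 = (rng.standard_normal((8, 16)) for _ in range(5))
out = two_simplicial_attention(q, k1, k2, v1, v2)
print(out.shape)  # (8, 16)
```

The cubic cost of the pair axis is why the practical kernel restricts the second key/value stream to a sliding window over nearby positions.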
Example
Train on enwik8, with 2-simplicial attention interleaved every 2 layers

```bash
$ pip install '.[examples]' && python train.py
```
Contributing
First install the test dependencies

```bash
$ pip install '.[test]'
```

Then add your code and make sure the tests pass

```bash
$ pytest tests
```
Citations
```bibtex
@misc{roy2025fastsimplex2simplicialattention,
    title   = {Fast and Simplex: 2-Simplicial Attention in Triton},
    author  = {Aurko Roy and Timothy Chou and Sai Surya Duvvuri and Sijia Chen and Jiecao Yu and Xiaodong Wang and Manzil Zaheer and Rohan Anil},
    year    = {2025},
    eprint  = {2507.02754},
    archivePrefix = {arXiv},
    primaryClass = {cs.LG},
    url     = {https://arxiv.org/abs/2507.02754},
}
```

```bibtex
@misc{clift2019logic2simplicialtransformer,
    title   = {Logic and the $2$-Simplicial Transformer},
    author  = {James Clift and Dmitry Doryn and Daniel Murfet and James Wallbridge},
    year    = {2019},
    eprint  = {1909.00668},
    archivePrefix = {arXiv},
    primaryClass = {cs.LG},
    url     = {https://arxiv.org/abs/1909.00668},
}
```

```bibtex
@article{Peng2024OnLO,
    title   = {On Limitations of the Transformer Architecture},
    author  = {Binghui Peng and Srini Narayanan and Christos Papadimitriou},
    journal = {ArXiv},
    year    = {2024},
    volume  = {abs/2402.08164},
    url     = {https://api.semanticscholar.org/CorpusID:267636545}
}
```