Simple implementations of attention modules adapted for the biological data domain
Project description
:construction: THIS CODE IS BEING ACTIVELY DEVELOPED :construction:
Don't look for stability here (yet).
Why use this package?
There are already plenty of excellent implementations out there that let you test the countless variants of transformers [1], [2]. This repository primarily distinguishes itself from those in that it contains positional encoding schemes adapted to allow for irregularly-spaced positions in sequences.
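As a concrete illustration of what "irregularly-spaced positions" means in practice, the sketch below evaluates a standard sinusoidal encoding at arbitrary real-valued coordinates rather than integer token indices. This is a minimal plain-PyTorch sketch of the idea, not this package's actual implementation; the function name and default dimension are placeholders.

```python
import torch


def sinusoidal_encoding(positions: torch.Tensor, dim: int = 64) -> torch.Tensor:
    """Evaluate a sinusoidal positional encoding at arbitrary, possibly
    irregularly-spaced, real-valued positions instead of integer indices."""
    # positions: float tensor of shape (..., seq_len), e.g. genomic coordinates
    half = dim // 2
    freqs = torch.exp(
        -torch.arange(half, dtype=torch.float32) * torch.log(torch.tensor(10_000.0)) / half
    )
    angles = positions.unsqueeze(-1) * freqs                 # (..., seq_len, dim // 2)
    return torch.cat([angles.sin(), angles.cos()], dim=-1)   # (..., seq_len, dim)


# Tokens observed at irregular coordinates along a sequence.
pos = torch.tensor([[0.0, 3.5, 4.0, 250.0]])
emb = sinusoidal_encoding(pos, dim=32)  # -> shape (1, 4, 32)
```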
Install
Since PyTorch is a dependency of bio-attention, we recommend installing PyTorch independently first, as your system may require a specific build (e.g. one matching your CUDA drivers).

After installing PyTorch, bio-attention can be installed using pip:

pip install bio-attention
Note
This package used to be a 2D sliding window attention package. The current formulation of the package no longer allows for this type of attention (instead, I recommend performing axial attention with alternating sliding window attention across one axis and full self-attention across the other). If you want to use 2D sliding window attention, check out the old version of this repo.
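A rough sketch of that axial workaround, written with plain PyTorch rather than this package's modules (the class name, window size, and head count below are arbitrary choices for the example):

```python
import torch
from torch import nn


def sliding_window_mask(length: int, window: int) -> torch.Tensor:
    # True entries are masked out: each query only attends to keys within +/- `window`.
    idx = torch.arange(length)
    return (idx[None, :] - idx[:, None]).abs() > window


class AxialBlock(nn.Module):
    """Alternates sliding-window attention over one axis with full self-attention
    over the other, for inputs of shape (batch, height, width, dim)."""

    def __init__(self, dim: int, heads: int = 4, window: int = 8):
        super().__init__()
        self.row_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.window = window

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, h, w, d = x.shape
        # Sliding-window self-attention along the width axis.
        rows = x.reshape(b * h, w, d)
        mask = sliding_window_mask(w, self.window).to(x.device)
        rows = rows + self.row_attn(rows, rows, rows, attn_mask=mask)[0]
        x = rows.reshape(b, h, w, d)
        # Full self-attention along the height axis.
        cols = x.permute(0, 2, 1, 3).reshape(b * w, h, d)
        cols = cols + self.col_attn(cols, cols, cols)[0]
        return cols.reshape(b, w, h, d).permute(0, 2, 1, 3)


x = torch.randn(2, 16, 32, 64)
out = AxialBlock(dim=64)(x)  # -> shape (2, 16, 32, 64)
```

Each query in the row pass only sees keys within the window along the width axis, while the column pass mixes information globally along the height axis.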
Usage
Package roadmap
- Embedding layers
  - Continuous
  - Discrete
  - Binary
  - Bin
- [~] Positional encoding schemes
  - Sinusoidal
  - Embedding
  - Continuous
  - Rotary
  - AliBi (see the sketch after this list)
  - DPB
  - XL
  - Test support for multi-dimensional inputs
- [~] Attention modules
  - Vanilla
  - Windowed
  - Random
  - Performer
  - Encoder
  - Decoder
  - Cross
  - Support for multi-dim inputs
  - Add a warning if non-increasing positional indices are used with a decoder attention
  - Add docs clarifying that clf tokens are automatically accounted for if no pos is provided for them
- Tests
- Typing
- Docs
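For the AliBi item in the roadmap above, one plausible adaptation to irregularly-spaced inputs is to compute the linear bias from real-valued position differences rather than integer index offsets. The following is a hedged sketch in plain PyTorch, not this package's API; the slope schedule follows the original ALiBi convention:

```python
import torch


def alibi_bias(positions: torch.Tensor, num_heads: int) -> torch.Tensor:
    """Additive attention bias in the spirit of ALiBi, computed from real-valued
    position differences rather than integer token indices."""
    # positions: (batch, seq_len) float coordinates
    slopes = 2.0 ** (-8.0 * torch.arange(1, num_heads + 1) / num_heads)  # one slope per head
    dist = (positions[:, :, None] - positions[:, None, :]).abs()          # (batch, L, L)
    return -slopes[None, :, None, None] * dist[:, None, :, :]             # (batch, heads, L, L)


pos = torch.tensor([[0.0, 1.0, 7.5, 8.0]])
bias = alibi_bias(pos, num_heads=4)  # add to the attention logits before softmax
```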
Hashes for bio_attention-0.0.10-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | b98e402ec678f7be4c92d387640ad67c34a1023764d1d33cf119839658229185
MD5 | 31955b8d309ae288795ce6994695b231
BLAKE2b-256 | 53e5e8f6e93b4f900ae1ee72b5a35372e8e0f8aa3d5b6c68854b69b5ffbf5b05