Simple implementations of attention modules adapted for the biological data domain
:construction: THIS CODE IS BEING ACTIVELY DEVELOPED :construction:
Don't look for stability here (yet).
Why use this package?
There are already plenty of excellent implementations out there that let you test the countless variants of transformers [1], [2]. This repository sets itself apart from those in that it contains positional encoding schemes adapted to allow for irregularly spaced positions in sequences.
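To illustrate the core idea (not the package's actual API), a sinusoidal positional encoding can be evaluated at arbitrary real-valued positions rather than only at integer token indices. A minimal numpy sketch, where `sinusoidal_encoding` is an illustrative helper name, not a function from `bio-attention`:

```python
import numpy as np

def sinusoidal_encoding(positions, dim):
    """Classic transformer sinusoidal encoding, but evaluated at
    arbitrary (possibly non-integer, irregularly spaced) positions,
    e.g. genomic coordinates. `dim` must be even.
    Illustrative sketch only, not the bio-attention API."""
    positions = np.asarray(positions, dtype=np.float64)[:, None]   # (L, 1)
    # Standard frequency schedule: 10000^(-2i/dim) for i = 0..dim/2-1
    freqs = np.exp(-np.log(10000.0) * np.arange(0, dim, 2) / dim)  # (dim/2,)
    angles = positions * freqs                                     # (L, dim/2)
    enc = np.empty((positions.shape[0], dim))
    enc[:, 0::2] = np.sin(angles)   # even dimensions: sine
    enc[:, 1::2] = np.cos(angles)   # odd dimensions: cosine
    return enc

# Positions need not be evenly spaced integers:
pe = sinusoidal_encoding([0.0, 2.5, 7.1, 1003.4], dim=8)
print(pe.shape)  # (4, 8)
```

Because the encoding is a function of the position value itself, nothing breaks when consecutive positions are 2.5 and 7.1 apart rather than exactly 1.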
Install
Since PyTorch is a dependency of `bio-attention`, we recommend installing PyTorch independently first, as your system may require a specific version (e.g. for CUDA drivers).

After installing PyTorch, `bio-attention` can be installed using pip:

```
pip install bio-attention
```
Note
This package used to be a 2D sliding window attention package. The current formulation of the package no longer allows for this type of attention (instead, I recommend performing axial attention, alternating sliding window attention across one axis with full self-attention across the other). If you want to use 2D sliding window attention, check out the old version of this repo.
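For readers unfamiliar with the recommended pattern: sliding window attention restricts each token to a local neighborhood, which in the axial scheme above is applied along one axis while the other axis uses full attention. A minimal sketch of the 1D window mask (illustrative only, not the package's implementation):

```python
import numpy as np

def sliding_window_mask(length, window):
    """Boolean attention mask for 1D sliding window attention:
    token i may attend to token j iff |i - j| <= window.
    In an axial scheme, this mask is applied along one axis while
    the other axis uses full (all-True) self-attention."""
    idx = np.arange(length)
    return np.abs(idx[:, None] - idx[None, :]) <= window

mask = sliding_window_mask(5, 1)  # each token sees itself and 1 neighbor per side
```

The mask is symmetric and banded; positions outside the band would have their attention logits set to -inf before the softmax.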
Usage
Package roadmap
- Embedding layers
  - Continuous
  - Discrete
  - Binary
  - Bin
- [~] Positional encoding schemes
  - Sinusoidal
  - Embedding
  - Continuous
  - Rotary
  - AliBi
  - DPB
  - XL
  - Test support for multi-dimensional inputs
- [~] Attention modules
  - Vanilla
  - Windowed
  - Random
  - Performer
  - Encoder
  - Decoder
  - Cross
  - Support for multi-dim inputs
  - Add a warning if non-increasing positional indices are used with decoder attention
  - Add docs clarifying that clf tokens are automatically accounted for if no position is provided for them
- Tests
- Typing
- Docs
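Some of the positional schemes on the roadmap reduce to simple position-dependent biases added to the attention logits. For example, ALiBi penalizes attention by a per-head slope times the token distance, which generalizes naturally to irregularly spaced positions by using `|p_i - p_j|` instead of `|i - j|`. A sketch (the function name and its generalization to real-valued positions are illustrative, not the package's API; the slope schedule follows the ALiBi paper's `2^(-8h/num_heads)` geometric series):

```python
import numpy as np

def alibi_bias(positions, num_heads):
    """ALiBi-style additive attention bias generalized to irregular
    positions: head h adds -slope_h * |p_i - p_j| to the logits.
    Illustrative sketch, not the bio-attention implementation."""
    p = np.asarray(positions, dtype=np.float64)
    dist = np.abs(p[:, None] - p[None, :])                          # (L, L)
    # Geometric slope schedule from the ALiBi paper, one slope per head
    slopes = 2.0 ** (-8.0 * np.arange(1, num_heads + 1) / num_heads)
    return -slopes[:, None, None] * dist                            # (H, L, L)

# Works with irregularly spaced positions, e.g. genomic coordinates:
bias = alibi_bias([0.0, 1.0, 5.5], num_heads=2)
```

Since the bias depends only on pairwise distances, it needs no learned parameters and no retraining to handle new position ranges.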