Simple implementations of attention modules adapted for the biological data domain
:construction: THIS CODE IS BEING ACTIVELY DEVELOPED :construction:
Don't look for stability here (yet).
Why use this package?
There are already plenty of excellent implementations out there that let you test the countless variants of transformers [1], [2]. This repository sets itself apart from those mainly by providing positional encoding schemes adapted to irregularly-spaced positions in sequences.
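As an illustration of what "irregularly-spaced positions" means here (a plain NumPy sketch, not this package's own API), a continuous sinusoidal encoding can be evaluated at arbitrary real-valued positions rather than only at consecutive integer indices:

```python
import numpy as np

def sinusoidal_encoding(positions, dim):
    """Evaluate sinusoidal positional encodings at arbitrary
    (possibly non-integer, irregularly spaced) positions.

    positions: 1D array-like of real-valued positions, shape (n,)
    dim: encoding dimension (must be even)
    """
    positions = np.asarray(positions, dtype=np.float64)[:, None]   # (n, 1)
    # Frequencies follow the standard transformer schedule 10000^(-2i/dim).
    freqs = np.exp(-np.log(10000.0) * np.arange(0, dim, 2) / dim)  # (dim/2,)
    angles = positions * freqs                                     # (n, dim/2)
    enc = np.empty((positions.shape[0], dim))
    enc[:, 0::2] = np.sin(angles)
    enc[:, 1::2] = np.cos(angles)
    return enc

# Positions (e.g. genomic coordinates) need not be consecutive integers:
enc = sinusoidal_encoding([0.0, 1.5, 7.0, 1042.25], dim=8)
print(enc.shape)  # (4, 8)
```

Because the encoding is a continuous function of position, gaps and fractional coordinates pose no problem; the hypothetical `sinusoidal_encoding` helper above is only for illustration.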
Install
Since PyTorch is a dependency of `bio-attention`, we recommend installing PyTorch independently first, as your system may require a specific build (e.g. one matching your CUDA drivers). After installing PyTorch, `bio-attention` can be installed using pip:

```bash
pip install bio-attention
```
Note
This package used to be a 2D sliding window attention package. The current formulation of the package no longer supports this type of attention (instead, I recommend performing axial attention, alternating sliding window attention across one axis with full self-attention across the other). If you want to use 2D sliding window attention, check out the old version of this repo.
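As a rough sketch of the axial pattern recommended above (plain NumPy, not this package's API; the sliding-window restriction on one axis is omitted for brevity), attention is applied independently along each axis of a 2D input, treating the other axis as a batch dimension:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def axial_attention_pass(x, axis):
    """Scaled dot-product self-attention along one axis of a (H, W, d)
    input; the other spatial axis is treated as a batch dimension."""
    x = np.moveaxis(x, axis, 1)            # (batch_axis, attn_axis, d)
    scores = x @ np.swapaxes(x, -1, -2)    # (B, L, L) dot-product scores
    attn = softmax(scores / np.sqrt(x.shape[-1]), axis=-1)
    out = attn @ x                         # weighted sum of values
    return np.moveaxis(out, 1, axis)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 6, 8))             # (H, W, d) grid of tokens
x = axial_attention_pass(x, axis=0)        # attend across rows (H)
x = axial_attention_pass(x, axis=1)        # attend across columns (W)
print(x.shape)  # (4, 6, 8)
```

In the alternating scheme described above, one of the two passes would restrict its scores to a local band (sliding window) instead of attending over the full axis.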
Usage
Package roadmap
- Embedding layers
  - Continuous
  - Discrete
  - Binary
  - Bin
- [~] Positional encoding schemes
  - Sinusoidal
  - Embedding
  - Continuous
  - Rotary
  - AliBi
  - DPB
  - XL
  - Test support for multi-dimensional inputs
- [~] Attention modules
  - Vanilla
  - Windowed
  - Random
  - Performer
  - Encoder
  - Decoder
  - Cross
  - Support for multi-dim inputs
- Tests
- Typing
- Docs
Hashes for bio_attention-0.0.3-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | c681a0e8449507752bd04e88029eacca712a2e46bfa7b3a0da0ddb13c2ae12e1 |
| MD5 | 93dd1277abf05cc6495c6016d1c18874 |
| BLAKE2b-256 | df1cef6cfff82d1e481c721332eda9eeac797d311b815867f71388cee8eaaf70 |