Simple implementations of attention modules adapted for the biological data domain
:construction: THIS CODE IS BEING ACTIVELY DEVELOPED :construction:
Don't look for stability here (yet).
Why use this package?
There are already plenty of excellent implementations out there that allow you to test out the countless variants of transformers [1], [2]. This repository primarily distinguishes itself from those in that it contains positional encoding schemes adapted to allow for irregularly-spaced positions in sequences.
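To make the irregular-position idea concrete, here is a minimal sketch (not the package's own code; the function name and signature are illustrative) of a sinusoidal encoding evaluated at arbitrary float positions rather than at the integer indices 0, 1, 2, ...:

```python
import torch

def sinusoidal_encoding(positions: torch.Tensor, dim: int, max_period: float = 10_000.0) -> torch.Tensor:
    """Sinusoidal encoding at arbitrary (possibly irregularly-spaced) float positions.

    positions: (..., L) tensor of positions; not required to be integers or evenly spaced.
    Returns: (..., L, dim) tensor of encodings.
    """
    assert dim % 2 == 0, "dim must be even"
    half = dim // 2
    # Classic transformer frequency schedule: 1 / max_period^(2i/dim).
    freqs = torch.exp(-torch.arange(half) * (torch.log(torch.tensor(max_period)) / half))
    angles = positions[..., None] * freqs  # (..., L, half)
    return torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)

# Irregularly spaced coordinates (e.g. genomic positions) work the same as regular ones.
pos = torch.tensor([0.0, 3.5, 100.0, 101.2])
enc = sinusoidal_encoding(pos, dim=64)  # shape (4, 64)
```

Because the encoding is a continuous function of position, gaps and uneven spacing carry through to the model instead of being flattened into consecutive indices.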
Install
Since PyTorch is a dependency of bio-attention, we recommend installing PyTorch independently first, as your system may require a specific version (e.g. CUDA drivers). After installing PyTorch, bio-attention can be installed using pip:
pip install bio-attention
Note
This package used to be a 2D sliding window attention package. The current formulation of the package no longer allows for this type of attention (instead, I recommend performing axial attention with alternating sliding window attention across one axis and full self-attention across the other). If you want to use 2D sliding window attention, check out the old version of this repo.
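The recommended axial alternative can be sketched as follows. This is an illustrative stand-in built from stock PyTorch modules, not bio-attention's own implementation; the class name, `window` parameter, and use of `nn.MultiheadAttention` are all assumptions:

```python
import torch
import torch.nn as nn

def sliding_window_mask(n: int, window: int) -> torch.Tensor:
    """Boolean mask, True where attention is *blocked* (outside the local window)."""
    idx = torch.arange(n)
    return (idx[None, :] - idx[:, None]).abs() > window

class Axial2DAttention(nn.Module):
    """One axial block: sliding-window attention along one axis, full attention along the other."""

    def __init__(self, dim: int, heads: int, window: int):
        super().__init__()
        self.row_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.window = window

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, h, w, d = x.shape
        # Rows: local (sliding-window) self-attention across the width axis.
        rows = x.reshape(b * h, w, d)
        mask = sliding_window_mask(w, self.window).to(x.device)
        rows, _ = self.row_attn(rows, rows, rows, attn_mask=mask)
        x = rows.reshape(b, h, w, d)
        # Columns: full self-attention across the height axis.
        cols = x.permute(0, 2, 1, 3).reshape(b * w, h, d)
        cols, _ = self.col_attn(cols, cols, cols)
        return cols.reshape(b, w, h, d).permute(0, 2, 1, 3)
```

Stacking such blocks approximates 2D locality at a fraction of the cost of true 2D sliding-window attention, since each axis is attended over separately.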
Usage
Package roadmap
- Embedding layers
  - Continuous
  - Discrete
  - Binary
  - Bin
- [~] Positional encoding schemes
  - Sinusoidal
  - Embedding
  - Continuous
  - Rotary
  - ALiBi
  - DPB
  - XL
  - Test support for multi-dimensional inputs
- [~] Attention modules
  - Vanilla
  - Windowed
  - Random
  - Performer
  - Encoder
  - Decoder
  - Cross
  - Support for multi-dim inputs
  - Add a warning if non-increasing positional indices are used with a decoder attention
  - Add docs clarifying that clf (classification) tokens are automatically accounted for if no position is provided for them
- Tests
- Typing
- Docs
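Several roadmap items combine naturally; for instance, an ALiBi-style bias can be computed from pairwise distances between positions, which extends it to irregular spacing for free. A minimal sketch (illustrative only, not the package's implementation; the geometric slope schedule follows the ALiBi paper):

```python
import torch

def alibi_bias(positions: torch.Tensor, num_heads: int) -> torch.Tensor:
    """ALiBi-style additive attention bias from pairwise position distances.

    positions: (L,) tensor of positions, which may be irregularly spaced floats.
    Returns: (num_heads, L, L) bias to add to the attention logits.
    """
    # Per-head slopes on a geometric schedule: 2^(-8h/H) for h = 1..H.
    slopes = 2.0 ** (-8.0 * torch.arange(1, num_heads + 1) / num_heads)
    dist = (positions[None, :] - positions[:, None]).abs()  # (L, L)
    return -slopes[:, None, None] * dist

pos = torch.tensor([0.0, 1.0, 5.0, 5.5])  # irregular spacing
bias = alibi_bias(pos, num_heads=8)  # shape (8, 4, 4)
```

With integer positions 0, 1, 2, ... this reduces to standard ALiBi; with real-valued positions, distant tokens are penalized in proportion to their actual separation.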