
bio-attention

Simple implementations of attention modules adapted for the biological data domain.


:construction: THIS CODE IS BEING ACTIVELY DEVELOPED :construction:

Don't look for stability here (yet).

Why use this package?

There are already plenty of excellent implementations out there that allow you to test out the countless variants of transformers [1], [2]. This repository distinguishes itself from these mainly in that it contains positional encoding schemes adapted to allow for irregularly-spaced positions in sequences.
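To illustrate why irregular spacing matters: classic sinusoidal encodings are defined for integer token indices, but the same formula evaluates cleanly at arbitrary real-valued positions (e.g. genomic coordinates with uneven gaps). The sketch below is a minimal, self-contained illustration of that idea in numpy; it is not the package's API, and the function name is hypothetical.

```python
import numpy as np

def sinusoidal_encoding(positions, dim):
    """Sinusoidal positional encoding evaluated at arbitrary
    (possibly irregularly-spaced, non-integer) positions.

    positions: 1-D sequence of floats, shape (seq_len,)
    dim: embedding dimension (assumed even here)
    """
    positions = np.asarray(positions, dtype=np.float64)
    # Frequencies follow the classic transformer recipe: 10000^(-2i/dim)
    freqs = 1.0 / (10000.0 ** (np.arange(0, dim, 2) / dim))
    angles = positions[:, None] * freqs[None, :]   # (seq_len, dim/2)
    enc = np.empty((len(positions), dim))
    enc[:, 0::2] = np.sin(angles)
    enc[:, 1::2] = np.cos(angles)
    return enc

# Unevenly spaced positions work just as well as 0, 1, 2, ...
emb = sinusoidal_encoding([0.0, 3.0, 3.5, 110.0], dim=8)
print(emb.shape)  # (4, 8)
```

Because the encoding is a pure function of position, nothing in it assumes unit spacing; the model simply receives coordinates instead of indices.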

Install

Since PyTorch is a dependency of bio-attention, we recommend installing PyTorch independently first, as your system may require a specific build (e.g. one matching your CUDA drivers).

After installing PyTorch, bio-attention can be installed using pip:

pip install bio-attention

Note

This package used to be a 2D sliding window attention package. The current formulation of the package does not allow for this type of attention anymore (instead, I recommend performing axial attention with alternating sliding window attention across one axis and full self-attention across the other). If you want to use 2D sliding window attention, check out the old version of this repo.
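The axial scheme suggested above can be sketched in a few lines: attend fully within each row, then attend with a sliding-window mask within each column. This is a simplified numpy illustration without learned query/key/value projections or multiple heads, and the function names are hypothetical, not part of bio-attention.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v, mask=None):
    # q, k, v: (..., length, dim); mask: True where attending is allowed
    scores = q @ np.swapaxes(k, -1, -2) / np.sqrt(q.shape[-1])
    if mask is not None:
        scores = np.where(mask, scores, -1e9)
    return softmax(scores) @ v

def sliding_window_mask(n, window):
    idx = np.arange(n)
    return np.abs(idx[:, None] - idx[None, :]) <= window

def axial_attention_2d(x, window=1):
    """x: (H, W, dim). Full self-attention along rows, then
    sliding-window attention along columns."""
    H, W, _ = x.shape
    # Full self-attention along the W axis (each row attends within itself)
    x = attention(x, x, x)
    # Windowed attention along the H axis: transpose so H is the sequence dim
    xt = np.swapaxes(x, 0, 1)                 # (W, H, dim)
    xt = attention(xt, xt, xt, mask=sliding_window_mask(H, window))
    return np.swapaxes(xt, 0, 1)              # back to (H, W, dim)

out = axial_attention_2d(np.random.randn(4, 5, 8), window=1)
print(out.shape)  # (4, 5, 8)
```

Alternating the two axes this way keeps the receptive field growing across layers while the windowed axis stays linear in sequence length.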

Usage

Package roadmap

  • Embedding layers
    • Continuous
    • Discrete
    • Binary
    • Bin
  • [~] Positional encoding schemes
    • Sinusoidal
    • Embedding
    • Continuous
    • Rotary
    • ALiBi
    • DPB
    • XL
    • Test support for multi-dimensional inputs
  • [~] Attention modules
    • Vanilla
    • Windowed
    • Random
    • Performer
    • Encoder
    • Decoder
    • Cross
    • Support for multi-dim inputs
  • Add a warning if non-increasing positional indices are used with a decoder attention
  • Add docs clarifying that clf tokens are automatically accounted for if no pos is provided for them
  • Tests
  • Typing
  • Docs

