
Building attention mechanisms and Transformer models from scratch. Alias ATF.

Project description

Attention mechanisms and Transformers


  • The goal of this repository is to host basic architecture and model training code for different attention mechanisms and transformer architectures.
  • At the moment, I'm more interested in learning and recreating these architectures from scratch than in full-fledged training. For now, I'll only be training these models on small datasets.

Installation

  • Using pip to install from PyPI
pip install Attention-and-Transformers
  • Using pip to install the latest version from GitHub
pip install git+https://github.com/veb-101/Attention-and-Transformers.git
  • Local clone and install
git clone https://github.com/veb-101/Attention-and-Transformers.git atf
cd atf
python setup.py install
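
Whichever installation route you use, a quick import check confirms that the package is available. The top-level import name below is an assumption based on the distribution name, so adjust it if the package exposes a different module:

python -c "import Attention_and_Transformers; print('ok')"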

Example Use

python load_test.py
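
Beyond running `load_test.py`, a usage sketch of the layers might look like the following. The module path, class name, and constructor arguments here are assumptions for illustration only, so consult the repository's README for the actual API:

```python
# Hypothetical usage sketch: the import path and the constructor signature
# below are assumptions, not the package's confirmed API.
import tensorflow as tf
from Attention_and_Transformers.ViT import MultiHeadSelfAttention  # assumed path

mhsa = MultiHeadSelfAttention(num_heads=2, embedding_dim=64)  # assumed signature
tokens = tf.random.normal((1, 196, 64))  # (batch, num_patches, embedding_dim)
print(mhsa(tokens).shape)  # expected: (1, 196, 64)
```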

Attention Mechanisms

| No. | Mechanism | Paper |
| --- | --- | --- |
| 1 | Multi-head Self Attention | Attention Is All You Need |
| 2 | Multi-head Self Attention 2D | MobileViT V1 |
| 3 | Separable Self Attention | MobileViT V2 |
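
As a reference for the first entry in the table, a minimal from-scratch multi-head self-attention layer in TensorFlow could look like the sketch below. This is an illustrative re-derivation of standard scaled dot-product attention, not the package's own implementation:

```python
# Illustrative multi-head self-attention ("Attention Is All You Need"),
# written from scratch with plain TensorFlow ops. Not the package's code.
import tensorflow as tf


class MultiHeadSelfAttention(tf.keras.layers.Layer):
    def __init__(self, embed_dim=64, num_heads=4, **kwargs):
        super().__init__(**kwargs)
        assert embed_dim % num_heads == 0, "embed_dim must be divisible by num_heads"
        self.embed_dim = embed_dim
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # One projection produces queries, keys, and values together.
        self.qkv = tf.keras.layers.Dense(3 * embed_dim)
        self.out_proj = tf.keras.layers.Dense(embed_dim)

    def call(self, x):
        batch = tf.shape(x)[0]
        seq_len = tf.shape(x)[1]
        # (batch, seq, 3*embed) -> (3, batch, heads, seq, head_dim)
        qkv = self.qkv(x)
        qkv = tf.reshape(qkv, (batch, seq_len, 3, self.num_heads, self.head_dim))
        qkv = tf.transpose(qkv, (2, 0, 3, 1, 4))
        q, k, v = qkv[0], qkv[1], qkv[2]
        # Scaled dot-product attention per head.
        scores = tf.matmul(q, k, transpose_b=True) / tf.math.sqrt(
            tf.cast(self.head_dim, x.dtype)
        )
        weights = tf.nn.softmax(scores, axis=-1)
        out = tf.matmul(weights, v)            # (batch, heads, seq, head_dim)
        out = tf.transpose(out, (0, 2, 1, 3))  # (batch, seq, heads, head_dim)
        out = tf.reshape(out, (batch, seq_len, self.embed_dim))
        return self.out_proj(out)


# Quick shape check on a dummy batch of 16-token sequences.
layer = MultiHeadSelfAttention(embed_dim=64, num_heads=4)
print(layer(tf.random.normal((2, 16, 64))).shape)  # (2, 16, 64)
```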

Transformer Models

| No. | Model | Paper |
| --- | --- | --- |
| 1 | Vision Transformer | An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale |
| 2 | MobileViT-V1 | MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer |
| 3 | MobileViT-V2 | Separable Self-attention for Mobile Vision Transformers |
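
To show how such attention blocks come together in one of the models above, here is a tiny Vision-Transformer-style classifier sketched with standard Keras layers. It is an illustrative outline of the ViT recipe (patch embedding, positional embedding, pre-norm encoder blocks), not this package's implementation, and all hyperparameters are arbitrary:

```python
# Tiny ViT-style classifier built from standard Keras layers; an illustrative
# sketch of the architecture, not this package's implementation.
import tensorflow as tf


class AddPositionEmbedding(tf.keras.layers.Layer):
    """Adds a learnable positional embedding to a (batch, tokens, dim) tensor."""

    def build(self, input_shape):
        self.pos = self.add_weight(
            name="pos_embed",
            shape=(1, input_shape[1], input_shape[2]),
            initializer="zeros",
            trainable=True,
        )

    def call(self, x):
        return x + self.pos


def build_tiny_vit(image_size=32, patch_size=4, embed_dim=64,
                   num_heads=4, num_layers=2, num_classes=10):
    num_patches = (image_size // patch_size) ** 2

    inputs = tf.keras.Input((image_size, image_size, 3))
    # Patch embedding: a strided conv cuts the image into non-overlapping
    # patches and projects each one to embed_dim.
    x = tf.keras.layers.Conv2D(embed_dim, patch_size, strides=patch_size)(inputs)
    x = tf.keras.layers.Reshape((num_patches, embed_dim))(x)
    x = AddPositionEmbedding()(x)

    for _ in range(num_layers):
        # Pre-norm transformer encoder block: MHSA + MLP, each with a residual.
        h = tf.keras.layers.LayerNormalization()(x)
        h = tf.keras.layers.MultiHeadAttention(
            num_heads=num_heads, key_dim=embed_dim // num_heads)(h, h)
        x = x + h
        h = tf.keras.layers.LayerNormalization()(x)
        h = tf.keras.layers.Dense(embed_dim * 2, activation="gelu")(h)
        h = tf.keras.layers.Dense(embed_dim)(h)
        x = x + h

    x = tf.keras.layers.LayerNormalization()(x)
    x = tf.keras.layers.GlobalAveragePooling1D()(x)
    outputs = tf.keras.layers.Dense(num_classes)(x)
    return tf.keras.Model(inputs, outputs)


model = build_tiny_vit()
print(model(tf.random.normal((2, 32, 32, 3))).shape)  # (2, 10)
```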


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

Attention_and_Transformers-0.0.15.tar.gz (17.5 kB)


Built Distribution

Attention_and_Transformers-0.0.15-py3-none-any.whl

File details

Details for the file Attention_and_Transformers-0.0.15.tar.gz.

File metadata

File hashes

Hashes for Attention_and_Transformers-0.0.15.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 18de0593625a77b0dacff19e64ef77b6860e4e9a8d6f06e99f9448c127f0fd07 |
| MD5 | c75f073989d43cbef4aea9d4950d427d |
| BLAKE2b-256 | 0c5d1b143ff86a9182751ac0ddabac126afce2f1c0e6ee91e812f46d0c9d2b3f |

See more details on using hashes here.

File details

Details for the file Attention_and_Transformers-0.0.15-py3-none-any.whl.

File metadata

File hashes

Hashes for Attention_and_Transformers-0.0.15-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | a32c67a0fcb200627baad4f66e7bcec4edc96771f1faf67d7af1c669ce139ae3 |
| MD5 | e91cb98da61973197058849f34b4c2c8 |
| BLAKE2b-256 | 862c83acacb0fa37c7e47809d896287e2440ba66682f4f948e423148dcca8482 |

See more details on using hashes here.
