Building attention mechanisms and Transformer models from scratch. Alias ATF.

Project description

Attention mechanisms and Transformers

(Badges: PyPI Python version · PyPI version · TensorFlow 2.10.0)

  • The goal of this repository is to host basic architecture and model training code for the different attention mechanisms and transformer architectures.
  • At the moment, I am more interested in learning and recreating these architectures from scratch than in full-fledged training, so for now I'll only be training these models on small datasets.

Installation

  • Using pip to install from PyPI
pip install Attention-and-Transformers
  • Using pip to install the latest version from GitHub
pip install git+https://github.com/veb-101/Attention-and-Transformers.git
  • Local clone and install
git clone https://github.com/veb-101/Attention-and-Transformers.git atf
cd atf
python setup.py install

Test Installation

python load_test.py
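
As a quick alternative to `load_test.py`, the snippet below simply imports the package. The import name `Attention_and_Transformers` is an assumption based on the underscored form of the distribution name; check `load_test.py` in the repository for the exact module paths.

```python
# Minimal smoke test. Assumption: the package imports as
# "Attention_and_Transformers" (underscored distribution name);
# see load_test.py in the repository for the exact import paths.
import tensorflow as tf
import Attention_and_Transformers as atf

print("TensorFlow version:", tf.__version__)
print("Package location:", atf.__file__)
```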

Attention Mechanisms

| No. | Mechanism | Paper |
| --- | --- | --- |
| 1 | Multi-Head Self Attention | Attention Is All You Need |
| 2 | Multi-Head Self Attention 2D | MobileViT v1 |
| 3 | Separable Self Attention | MobileViT v2 |
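
To make the table concrete, here is a minimal from-scratch sketch of entry 1 (standard multi-head self-attention) in TensorFlow/Keras. It illustrates the mechanism only; the layer name and constructor arguments are illustrative and are not this package's API.

```python
import tensorflow as tf

class MultiHeadSelfAttention(tf.keras.layers.Layer):
    """Illustrative multi-head self-attention ("Attention Is All You Need")."""

    def __init__(self, embed_dim=64, num_heads=4, **kwargs):
        super().__init__(**kwargs)
        assert embed_dim % num_heads == 0, "embed_dim must divide evenly across heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # One fused projection for queries, keys, and values, plus an output projection.
        self.qkv = tf.keras.layers.Dense(3 * embed_dim)
        self.out = tf.keras.layers.Dense(embed_dim)

    def call(self, x):
        batch = tf.shape(x)[0]
        seq_len = tf.shape(x)[1]
        # (batch, seq, 3*embed) -> (3, batch, heads, seq, head_dim)
        qkv = self.qkv(x)
        qkv = tf.reshape(qkv, (batch, seq_len, 3, self.num_heads, self.head_dim))
        qkv = tf.transpose(qkv, (2, 0, 3, 1, 4))
        q, k, v = qkv[0], qkv[1], qkv[2]
        # Scaled dot-product attention per head.
        scores = tf.matmul(q, k, transpose_b=True) * (self.head_dim ** -0.5)
        weights = tf.nn.softmax(scores, axis=-1)
        context = tf.matmul(weights, v)  # (batch, heads, seq, head_dim)
        # Merge heads back into the embedding dimension.
        context = tf.transpose(context, (0, 2, 1, 3))
        context = tf.reshape(context, (batch, seq_len, self.num_heads * self.head_dim))
        return self.out(context)

# Quick shape check on random input.
layer = MultiHeadSelfAttention(embed_dim=64, num_heads=4)
print(layer(tf.random.normal((2, 16, 64))).shape)  # (2, 16, 64)
```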

Transformer Models

| No. | Model | Paper |
| --- | --- | --- |
| 1 | Vision Transformer | An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale |
| 2 | MobileViT-V1 | MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer |
| 3 | MobileViT-V2 (under development) | Separable Self-attention for Mobile Vision Transformers |
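
The "16x16 words" in the Vision Transformer paper's title are image patches. Below is a minimal sketch (not this repository's implementation) of the patch-embedding step that turns an image into a token sequence, using the common trick of a Conv2D whose kernel size equals its stride.

```python
import tensorflow as tf

# Patch embedding sketch for a Vision Transformer: a Conv2D with
# kernel_size == strides slices the image into non-overlapping 16x16
# patches and projects each one to an embedding vector.
patch_size, embed_dim = 16, 64
patch_embed = tf.keras.layers.Conv2D(
    filters=embed_dim, kernel_size=patch_size, strides=patch_size
)

images = tf.random.normal((2, 224, 224, 3))        # batch of RGB images
patches = patch_embed(images)                      # (2, 14, 14, 64)
tokens = tf.reshape(patches, (2, -1, embed_dim))   # (2, 196, 64): the "16x16 words"
print(tokens.shape)
```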

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

Attention_and_Transformers-0.0.8.tar.gz (13.8 kB)

Uploaded: Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

Attention_and_Transformers-0.0.8-py3-none-any.whl (21.2 kB)

Uploaded: Python 3

File details

Details for the file Attention_and_Transformers-0.0.8.tar.gz.

File metadata

File hashes

Hashes for Attention_and_Transformers-0.0.8.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | ef774d1590623d449d452d79bf6b5a148b3608ad1647578cf227f97b353d6616 |
| MD5 | 221c1e0eb86cf19d518f497f6475f45f |
| BLAKE2b-256 | a8988406d632ac2a38e399fb04aeb4a01f332252b4a4f04e634beea3fa52be28 |

See more details on using hashes here.
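
To check a downloaded archive against the digests above, here is a standard-library sketch; it assumes the file sits in the current working directory.

```python
import hashlib

# SHA256 digest for Attention_and_Transformers-0.0.8.tar.gz,
# copied from the table above.
EXPECTED = "ef774d1590623d449d452d79bf6b5a148b3608ad1647578cf227f97b353d6616"

with open("Attention_and_Transformers-0.0.8.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == EXPECTED else f"MISMATCH: {digest}")
```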

File details

Details for the file Attention_and_Transformers-0.0.8-py3-none-any.whl.

File metadata

File hashes

Hashes for Attention_and_Transformers-0.0.8-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 1898fb1e687bd72eea610fed3735cca6a2f4c9227a414dfb2932cc26dcdb8555 |
| MD5 | 5955565e6c75118f26377d5a60f13f70 |
| BLAKE2b-256 | ada81aa838210bf384b8aeaf4009420e6c0a8f195decd8f59a411dee5ad26d67 |

See more details on using hashes here.
