Building attention mechanisms and Transformer models from scratch. Alias: ATF.

Project description

Attention mechanisms and Transformers

Built on TensorFlow 2.10.0.

  • The goal of this repository is to host basic architecture and model training code for different attention mechanisms and transformer architectures.
  • At the moment, I am more interested in learning and recreating these architectures from scratch than in full-fledged training, so for now I'll only be training these models on small datasets.

Installation

  • Using pip to install from PyPI
pip install Attention-and-Transformers
  • Using pip to install the latest version from GitHub
pip install git+https://github.com/veb-101/Attention-and-Transformers.git
  • Local clone and install
git clone https://github.com/veb-101/Attention-and-Transformers.git atf
cd atf
python setup.py install

Test Installation

python load_test.py
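If load_test.py isn't at hand (for example, after installing from PyPI rather than cloning the repository), a rough equivalent is to check that the package imports cleanly. A minimal sketch, assuming the import name matches the distribution name; this is an assumption, not a confirmed API:

import importlib

# Hypothetical smoke test: the module name below is assumed from the
# wheel's distribution name and may differ from the package's actual layout.
module = importlib.import_module("Attention_and_Transformers")
print("Imported", module.__name__, "successfully")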

Attention Mechanisms

No.  Mechanism                      Paper
1    Multi-head Self Attention      Attention Is All You Need
2    Multi-head Self Attention 2D   MobileViT V1
3    Separable Self Attention       MobileViT V2
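
For reference, here is a minimal from-scratch sketch of the first mechanism, multi-head self-attention, written against TensorFlow/Keras (the stack this package builds on). It illustrates the scaled dot-product formulation from "Attention Is All You Need"; it is not this package's implementation, and all names are local to the example.

import tensorflow as tf

class MHSASketch(tf.keras.layers.Layer):
    # Minimal multi-head self-attention: project to Q, K, V, attend per
    # head with softmax(QK^T / sqrt(d_k)) V, then merge heads.
    def __init__(self, embed_dim=64, num_heads=4, **kwargs):
        super().__init__(**kwargs)
        assert embed_dim % num_heads == 0, "embed_dim must divide evenly across heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        self.wq = tf.keras.layers.Dense(embed_dim)
        self.wk = tf.keras.layers.Dense(embed_dim)
        self.wv = tf.keras.layers.Dense(embed_dim)
        self.wo = tf.keras.layers.Dense(embed_dim)

    def _split_heads(self, x):
        # (batch, seq, embed) -> (batch, heads, seq, head_dim)
        batch = tf.shape(x)[0]
        x = tf.reshape(x, (batch, -1, self.num_heads, self.head_dim))
        return tf.transpose(x, perm=[0, 2, 1, 3])

    def call(self, x):
        q = self._split_heads(self.wq(x))
        k = self._split_heads(self.wk(x))
        v = self._split_heads(self.wv(x))
        scale = tf.math.sqrt(tf.cast(self.head_dim, tf.float32))
        weights = tf.nn.softmax(tf.matmul(q, k, transpose_b=True) / scale, axis=-1)
        out = tf.matmul(weights, v)                 # (batch, heads, seq, head_dim)
        out = tf.transpose(out, perm=[0, 2, 1, 3])  # (batch, seq, heads, head_dim)
        batch = tf.shape(out)[0]
        out = tf.reshape(out, (batch, -1, self.num_heads * self.head_dim))
        return self.wo(out)

# Sanity check: output shape matches input shape.
layer = MHSASketch(embed_dim=64, num_heads=4)
print(layer(tf.random.normal((2, 16, 64))).shape)  # (2, 16, 64)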

Transformer Models

No.  Model                               Paper
1    Vision Transformer                  An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
2    MobileViT-V1                        MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer
3    MobileViT-V2 (under development)    Separable Self-attention for Mobile Vision Transformers
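
The "16x16 words" idea behind the Vision Transformer is easy to demonstrate: split the image into fixed-size patches and linearly project each patch into a token. A minimal sketch using the common strided-convolution trick, not this package's own layers:

import tensorflow as tf

def patch_embed(images, patch_size=16, embed_dim=64):
    # A Conv2D whose kernel size and stride both equal the patch size is
    # equivalent to a linear projection of non-overlapping patches.
    proj = tf.keras.layers.Conv2D(embed_dim, kernel_size=patch_size, strides=patch_size)
    x = proj(images)                              # (batch, H/p, W/p, embed_dim)
    batch = tf.shape(x)[0]
    return tf.reshape(x, (batch, -1, embed_dim))  # (batch, num_patches, embed_dim)

# A 224x224 image with 16x16 patches yields 14*14 = 196 tokens.
tokens = patch_embed(tf.random.normal((2, 224, 224, 3)))
print(tokens.shape)  # (2, 196, 64)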

Project details


Download files

Download the file for your platform. If you're not sure which to choose, see the Python packaging guide on installing packages.

Source Distribution

Attention_and_Transformers-0.0.13.tar.gz (13.9 kB)


Built Distribution


Attention_and_Transformers-0.0.13-py3-none-any.whl (21.3 kB)


File details

Details for the file Attention_and_Transformers-0.0.13.tar.gz.


File hashes

Hashes for Attention_and_Transformers-0.0.13.tar.gz
Algorithm Hash digest
SHA256 159d38036aa1d465efcf01567ee068409b3b715caeabc3f1bfc232ccbb1d5536
MD5 97c440174e5f6e1e9c8264a5b6205498
BLAKE2b-256 8298be8e4a64717d300409d12bb75c6768db3851960ac1b72732149dccfae29e

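To check a downloaded file against a published digest yourself, hashing it with the standard library is enough; for example, for the source distribution above:

import hashlib

def sha256_of(path, chunk_size=8192):
    # Stream the file in chunks so large archives aren't loaded into memory at once.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Expected SHA256 for Attention_and_Transformers-0.0.13.tar.gz, copied from above.
expected = "159d38036aa1d465efcf01567ee068409b3b715caeabc3f1bfc232ccbb1d5536"
print(sha256_of("Attention_and_Transformers-0.0.13.tar.gz") == expected)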

File details

Details for the file Attention_and_Transformers-0.0.13-py3-none-any.whl.


File hashes

Hashes for Attention_and_Transformers-0.0.13-py3-none-any.whl
Algorithm Hash digest
SHA256 f48381fee0b416b331498405213b27d9b76b7b5d8739fa4bb7f666b5d5a1b47e
MD5 cea6d116e8a4f9e46ae110c42ec04a5e
BLAKE2b-256 5be99bf0c2a047aa23ac273b7475757d392edcb467f4a5990a624ccfb27bd063

