
Building attention mechanisms and Transformer models from scratch. Alias: ATF.

Project description

Attention mechanisms and Transformers

Built with TensorFlow 2.10.0.

  • The goal of this repository is to host basic architecture and model training code for different attention mechanisms and transformer architectures.
  • At the moment, I am more interested in learning and recreating these architectures from scratch than in full-fledged training, so I'll only be training these models on small datasets.

Installation

  • Install from PyPI using pip:
pip install Attention-and-Transformers
  • Install the latest version from GitHub using pip:
pip install git+https://github.com/veb-101/Attention-and-Transformers.git
  • Clone the repository and install locally:
git clone https://github.com/veb-101/Attention-and-Transformers.git atf
cd atf
python setup.py install

Test Installation

python load_test.py
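
If you installed from PyPI without cloning the repository, a quick import check works as well. This assumes the top-level import name is Attention_and_Transformers (matching the distribution file names below); adjust if the package exposes a different module name:

python -c "import Attention_and_Transformers"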

Attention Mechanisms

No.  Mechanism                      Paper
1    Multi-head Self Attention      Attention Is All You Need
2    Multi-head Self Attention 2D   MobileViT-V1
3    Separable Self Attention       MobileViT-V2
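
For reference, here is a minimal from-scratch multi-head self-attention layer in TensorFlow, in the spirit of the first table entry. This is an illustrative sketch written for this description, not the package's own class (whose import path and signature may differ):

import tensorflow as tf


class MultiHeadSelfAttention(tf.keras.layers.Layer):
    """Multi-head self-attention as described in "Attention Is All You Need"."""

    def __init__(self, embed_dim=64, num_heads=8, **kwargs):
        super().__init__(**kwargs)
        assert embed_dim % num_heads == 0, "embed_dim must be divisible by num_heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # One dense layer produces the concatenated Q, K, V projections.
        self.qkv = tf.keras.layers.Dense(3 * embed_dim, use_bias=False)
        self.out_proj = tf.keras.layers.Dense(embed_dim)

    def call(self, x):
        # x: (batch, num_tokens, embed_dim)
        batch = tf.shape(x)[0]
        tokens = tf.shape(x)[1]
        qkv = self.qkv(x)  # (batch, tokens, 3 * embed_dim)
        qkv = tf.reshape(qkv, (batch, tokens, 3, self.num_heads, self.head_dim))
        qkv = tf.transpose(qkv, perm=(2, 0, 3, 1, 4))  # (3, batch, heads, tokens, head_dim)
        q, k, v = qkv[0], qkv[1], qkv[2]
        # Scaled dot-product attention over all token pairs: O(N^2) in sequence length.
        scores = tf.matmul(q, k, transpose_b=True) / tf.math.sqrt(float(self.head_dim))
        weights = tf.nn.softmax(scores, axis=-1)
        out = tf.matmul(weights, v)  # (batch, heads, tokens, head_dim)
        # Merge heads back into the embedding dimension.
        out = tf.transpose(out, perm=(0, 2, 1, 3))
        out = tf.reshape(out, (batch, tokens, self.num_heads * self.head_dim))
        return self.out_proj(out)


# Quick shape check on random data.
layer = MultiHeadSelfAttention(embed_dim=64, num_heads=8)
print(layer(tf.random.normal((2, 16, 64))).shape)  # (2, 16, 64)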

Transformer Models

No.  Model                             Paper
1    Vision Transformer                An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
2    MobileViT-V1                      MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer
3    MobileViT-V2 (under development)  Separable Self-attention for Mobile Vision Transformers
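
Since MobileViT-V2's distinguishing piece is the separable self-attention listed in the mechanisms table above, here is a minimal from-scratch TensorFlow sketch of that mechanism. It replaces the O(N²) token-to-token attention map with a single latent-token context vector computed in O(N). Again, this is an illustration written for this description, not the package's own class:

import tensorflow as tf


class SeparableSelfAttention(tf.keras.layers.Layer):
    """Separable self-attention (MobileViT-V2): linear-cost attention via one latent token."""

    def __init__(self, embed_dim=64, **kwargs):
        super().__init__(**kwargs)
        self.to_scores = tf.keras.layers.Dense(1)         # I branch: one scalar score per token
        self.to_key = tf.keras.layers.Dense(embed_dim)    # K branch
        self.to_value = tf.keras.layers.Dense(embed_dim)  # V branch
        self.out_proj = tf.keras.layers.Dense(embed_dim)

    def call(self, x):
        # x: (batch, num_tokens, embed_dim)
        # Context scores: softmax over the token axis of a scalar projection.
        scores = tf.nn.softmax(self.to_scores(x), axis=1)  # (batch, tokens, 1)
        # Context vector: score-weighted sum of keys -> one global token.
        context = tf.reduce_sum(scores * self.to_key(x), axis=1, keepdims=True)  # (batch, 1, embed_dim)
        # Broadcast the global context over every ReLU-gated value token.
        out = tf.nn.relu(self.to_value(x)) * context
        return self.out_proj(out)


# Quick shape check on random data.
layer = SeparableSelfAttention(embed_dim=64)
print(layer(tf.random.normal((2, 16, 64))).shape)  # (2, 16, 64)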



Download files

Download the file for your platform.

Source Distribution

Attention_and_Transformers-0.0.14.tar.gz (13.9 kB)

Uploaded Source

Built Distribution


Attention_and_Transformers-0.0.14-py3-none-any.whl (21.3 kB)

Uploaded Python 3

File details

Details for the file Attention_and_Transformers-0.0.14.tar.gz.

File metadata

File hashes

Hashes for Attention_and_Transformers-0.0.14.tar.gz
Algorithm    Hash digest
SHA256       1c0d495d9576a727cca5b2abc2137a80138de2e10430a566e79cc65dfd96bad5
MD5          4a6d9d7feb0cbed48d3e00cc1c8be2aa
BLAKE2b-256  5704b11bd785472a5ca2f7a1133c04fd8c9421cd7d1207f331c2d2f47598699a


File details

Details for the file Attention_and_Transformers-0.0.14-py3-none-any.whl.

File metadata

File hashes

Hashes for Attention_and_Transformers-0.0.14-py3-none-any.whl
Algorithm    Hash digest
SHA256       dddce00a839ac0ba029af7aa3d6f761b31880ade165a6c37de7e26a9069111b4
MD5          86ae8b919eff9d6268637e9b93bb2001
BLAKE2b-256  13da1c650d093d45772a2883d0465ea6a50bdb16d4142a85520517e1a52c0b6b

