
Building attention mechanisms and Transformer models from scratch. Alias ATF.

Project description

Attention mechanisms and Transformers


  • The goal of this repository is to host basic architecture and model training code for different attention mechanisms and Transformer architectures.
  • At the moment, I'm more interested in learning and recreating these architectures from scratch than in full-fledged training, so for now I'll only be training these models on small datasets.

Installation

  • Using pip to install from PyPI
pip install Attention-and-Transformers
  • Using pip to install the latest version from GitHub
pip install git+https://github.com/veb-101/Attention-and-Transformers.git
  • Local clone and install
git clone https://github.com/veb-101/Attention-and-Transformers.git atf
cd atf
python setup.py install

Test Installation

python load_test.py
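
If load_test.py is not to hand, a quick smoke test is to import the package directly (assuming the import name Attention_and_Transformers, which the wheel filename under Built Distribution below suggests):

python -c "import Attention_and_Transformers"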

Attention Mechanisms

No. | Mechanism | Paper
1 | Multi-head Self Attention | Attention Is All You Need
2 | Multi-head Self Attention 2D | MobileViT-V1
3 | Separable Self Attention | MobileViT-V2
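
For reference, here is a minimal sketch of the scaled dot-product attention that the standard multi-head mechanisms above build on (MobileViT-V2's separable attention replaces it with a linear-complexity variant). This is an illustrative TensorFlow implementation, not the package's own code:

import tensorflow as tf

def scaled_dot_product_attention(q, k, v):
    # q, k, v: (batch, num_heads, seq_len, depth)
    scores = tf.matmul(q, k, transpose_b=True)  # query-key similarity scores
    scores /= tf.math.sqrt(tf.cast(tf.shape(k)[-1], tf.float32))  # scale by sqrt(depth)
    weights = tf.nn.softmax(scores, axis=-1)  # attention distribution over keys
    return tf.matmul(weights, v)  # weighted sum of value vectors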

Transformer Models

No. | Model | Paper
1 | Vision Transformer | An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
2 | MobileViT-V1 | MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer
3 | MobileViT-V2 (under development) | Separable Self-attention for Mobile Vision Transformers
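
The Vision Transformer's core idea is patch embedding: the image is split into 16x16 patches, each flattened and linearly projected into a token, and the resulting sequence is fed to a standard Transformer encoder. A minimal TensorFlow sketch of the patching step (illustrative only; the embedding dimension 192 is an arbitrary choice, not the package's default):

import tensorflow as tf

images = tf.random.normal((2, 224, 224, 3))  # batch of two 224x224 RGB images
patches = tf.image.extract_patches(
    images,
    sizes=[1, 16, 16, 1],
    strides=[1, 16, 16, 1],
    rates=[1, 1, 1, 1],
    padding="VALID",
)  # (2, 14, 14, 768): one flattened 16*16*3 vector per patch
tokens = tf.reshape(patches, (2, 14 * 14, 16 * 16 * 3))  # sequence of 196 patch vectors
tokens = tf.keras.layers.Dense(192)(tokens)  # linear projection to the embedding dimension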

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

Attention_and_Transformers-0.0.10.tar.gz (13.8 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

Attention_and_Transformers-0.0.10-py3-none-any.whl (21.3 kB)

Uploaded Python 3

File details

Details for the file Attention_and_Transformers-0.0.10.tar.gz.

File metadata

File hashes

Hashes for Attention_and_Transformers-0.0.10.tar.gz
Algorithm | Hash digest
SHA256 | 7b673233332d09e1de052409e4db5f574bdcbacc279b6c07505d9abb5e96244b
MD5 | 5613f36a8f3b8cacec42db9fd5b1b5c7
BLAKE2b-256 | 9f7a1abf3baa59e239e973debf05df235b3835c012046fef0c1c473cb6cb16bf

See more details on using hashes here.
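
To check a downloaded file against the published digests yourself, recompute the hash and compare; for example, for the sdist above:

import hashlib

# Recompute the SHA256 of the downloaded sdist and compare to the published digest.
with open("Attention_and_Transformers-0.0.10.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
assert digest == "7b673233332d09e1de052409e4db5f574bdcbacc279b6c07505d9abb5e96244b"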

File details

Details for the file Attention_and_Transformers-0.0.10-py3-none-any.whl.

File metadata

File hashes

Hashes for Attention_and_Transformers-0.0.10-py3-none-any.whl
Algorithm | Hash digest
SHA256 | d2b07a813a567cf913654ac1bd6d3d5c542a5ce3d262bfbfdec1129a51139fa3
MD5 | 56e1c802adee0d45897a6dd26dbbebc3
BLAKE2b-256 | 8f05b1c5c4a9192fa46062e2ba9ba28ed018366d106914ba6fafcace033128d4

See more details on using hashes here.
