
Building attention mechanisms and Transformer models from scratch (alias: ATF).

Project description

Attention mechanisms and Transformers

(Badges: PyPI Python version · PyPI version · TensorFlow 2.10.0)

  • The goal of this repository is to host basic architecture and model training code for the different attention mechanisms and Transformer architectures.
  • At the moment, I'm more interested in learning and recreating these architectures from scratch than in full-fledged training, so for now I'll only be training these models on small datasets.

Installation

  • Using pip to install from PyPI
pip install Attention-and-Transformers
  • Using pip to install the latest version from GitHub
pip install git+https://github.com/veb-101/Attention-and-Transformers.git
  • Local clone and install
git clone https://github.com/veb-101/Attention-and-Transformers.git atf
cd atf
python setup.py install

Test Installation

python load_test.py
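
Alternatively, a quick import check works as a smoke test. The import name below is an assumption matching the wheel's distribution name; adjust it if your install differs:

```python
# Assumed import name (matches the wheel file name); adjust if it differs.
import Attention_and_Transformers

print("Import OK:", Attention_and_Transformers.__name__)
```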

Attention Mechanisms

No. | Mechanism | Paper
--- | --- | ---
1 | Multi-head Self Attention | Attention Is All You Need
2 | Multi-head Self Attention 2D | MobileViT-V1
3 | Separable Self Attention | MobileViT-V2
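
For orientation, the core of multi-head self-attention is scaled dot-product attention computed independently per head. Below is a minimal TensorFlow sketch of the idea; it is illustrative only, not this package's implementation, and all layer and variable names are mine:

```python
import tensorflow as tf


class MultiHeadSelfAttention(tf.keras.layers.Layer):
    """Scaled dot-product multi-head self-attention (illustrative sketch)."""

    def __init__(self, embed_dim=64, num_heads=4, **kwargs):
        super().__init__(**kwargs)
        if embed_dim % num_heads != 0:
            raise ValueError("embed_dim must be divisible by num_heads")
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        self.embed_dim = embed_dim
        self.qkv = tf.keras.layers.Dense(3 * embed_dim)  # fused Q, K, V projection
        self.out = tf.keras.layers.Dense(embed_dim)      # output projection

    def call(self, x):
        b = tf.shape(x)[0]
        n = tf.shape(x)[1]
        qkv = self.qkv(x)                                 # (B, N, 3*D)
        qkv = tf.reshape(qkv, (b, n, 3, self.num_heads, self.head_dim))
        qkv = tf.transpose(qkv, (2, 0, 3, 1, 4))          # (3, B, heads, N, head_dim)
        q, k, v = qkv[0], qkv[1], qkv[2]
        scale = tf.cast(self.head_dim, x.dtype) ** -0.5
        weights = tf.nn.softmax(tf.matmul(q, k, transpose_b=True) * scale, axis=-1)
        out = tf.matmul(weights, v)                       # (B, heads, N, head_dim)
        out = tf.transpose(out, (0, 2, 1, 3))             # (B, N, heads, head_dim)
        return self.out(tf.reshape(out, (b, n, self.embed_dim)))


# Shape check: batch of 8 sequences of 197 tokens with 64-dim embeddings.
layer = MultiHeadSelfAttention(embed_dim=64, num_heads=4)
print(layer(tf.random.normal((8, 197, 64))).shape)  # (8, 197, 64)
```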

Transformer Models

No. | Model | Paper
--- | --- | ---
1 | Vision Transformer | An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
2 | MobileViT-V1 | MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer
3 | MobileViT-V2 (under development) | Separable Self-attention for Mobile Vision Transformers
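
A hedged usage sketch for the Vision Transformer: the import path below mirrors the repository layout and the constructor argument is hypothetical, so check the repository for the exact API:

```python
import tensorflow as tf

# Assumed import path and constructor arguments; the real API may differ.
from Attention_and_Transformers.ViT import VisionTransformer

model = VisionTransformer(num_classes=10)            # hypothetical signature
logits = model(tf.random.normal((1, 224, 224, 3)))   # one 224x224 RGB image
print(logits.shape)
```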

Download files

Download the file for your platform.

Source Distribution

Attention_and_Transformers-0.0.11.tar.gz (13.9 kB)

Uploaded Source

Built Distribution


Attention_and_Transformers-0.0.11-py3-none-any.whl (21.3 kB)

Uploaded Python 3

File details

Details for the file Attention_and_Transformers-0.0.11.tar.gz.


File hashes

Hashes for Attention_and_Transformers-0.0.11.tar.gz
Algorithm | Hash digest
--- | ---
SHA256 | ceb3d7abe776eaed46ab66d0eab96c29f83483d5a60c8fbe8cbf8dd6ef30ba2d
MD5 | d4a1f8f6e425bd55e10753545c59f7e3
BLAKE2b-256 | 7c103360d9076daca7e49e4107b622cf24d0c3f3397352c4dcf916ebb8a4e72d

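To verify a downloaded file against the digests above, Python's standard hashlib module is enough. A minimal sketch (the file path is wherever the download was saved):

```python
import hashlib

# Path to the downloaded sdist; adjust to where the file was saved.
path = "Attention_and_Transformers-0.0.11.tar.gz"
expected = "ceb3d7abe776eaed46ab66d0eab96c29f83483d5a60c8fbe8cbf8dd6ef30ba2d"

with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "MISMATCH")
```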

File details

Details for the file Attention_and_Transformers-0.0.11-py3-none-any.whl.


File hashes

Hashes for Attention_and_Transformers-0.0.11-py3-none-any.whl
Algorithm | Hash digest
--- | ---
SHA256 | 82e988670beb8c8645d322857f2fc30a64f5f54beeea1fba269e6ca773083a31
MD5 | 9cb610c56dfae9eab23f6e6592bcaf99
BLAKE2b-256 | 301fdc2cc3597f3e3b4014c79ec1a3e20cefd6dc0b47174da01333378be7132c

