
Building attention mechanisms and Transformer models from scratch. Alias ATF. https://github.com/veb-101/Attention-and-Transformers

Project description

Attention mechanisms and Transformers

Python 3.10.4 | TensorFlow 2.10.0

  • The goal of this repository is to host basic architecture and model training code for the different attention mechanisms and Transformer architectures.

  • At the moment, I'm more interested in learning and recreating these new architectures from scratch than in full-fledged training. For now, I'll train these models only on small datasets.

Attention Mechanisms

| # No. | Mechanism | Paper |
| --- | --- | --- |
| 1 | Multi-head Self Attention | Attention Is All You Need |
| 2 | Multi-head Self Attention 2D | MobileViT V1 |
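For context, the first mechanism in the table, scaled dot-product multi-head self-attention from "Attention Is All You Need", can be sketched in a few lines of NumPy. This is a self-contained illustration with random stand-in weights, not the repository's TensorFlow implementation:

```python
import numpy as np

def multi_head_self_attention(x, num_heads, rng=None):
    """Scaled dot-product multi-head self-attention over x of shape
    (batch, seq_len, d_model). Projection weights are random stand-ins
    for learned parameters."""
    rng = np.random.default_rng(0) if rng is None else rng
    batch, seq_len, d_model = x.shape
    assert d_model % num_heads == 0, "d_model must divide evenly across heads"
    d_head = d_model // num_heads

    # Random projections stand in for the learned W_q, W_k, W_v, W_o.
    w_q, w_k, w_v, w_o = (0.02 * rng.standard_normal((d_model, d_model))
                          for _ in range(4))

    def split_heads(t):
        # (batch, seq, d_model) -> (batch, heads, seq, d_head)
        return t.reshape(batch, seq_len, num_heads, d_head).transpose(0, 2, 1, 3)

    q, k, v = split_heads(x @ w_q), split_heads(x @ w_k), split_heads(x @ w_v)

    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_head)) V, computed per head.
    scores = q @ k.transpose(0, 1, 3, 2) / np.sqrt(d_head)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)

    out = weights @ v                                  # (batch, heads, seq, d_head)
    out = out.transpose(0, 2, 1, 3).reshape(batch, seq_len, d_model)
    return out @ w_o                                   # final output projection

x = np.random.default_rng(1).standard_normal((2, 5, 8))
print(multi_head_self_attention(x, num_heads=2).shape)  # (2, 5, 8)
```

The 2D variant used by MobileViT applies the same computation to sequences unfolded from feature-map patches rather than to a flat token sequence.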

Transformer Models

| # No. | Models | Paper |
| --- | --- | --- |
| 1 | Vision Transformer | An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale |
| 2 | MobileViT-V1 | MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer |
| 3 | MobileViT-V2 (under development) | Separable Self-attention for Mobile Vision Transformers |
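The ViT paper's title refers to slicing an image into fixed-size patches that are fed to a Transformer as tokens. A minimal NumPy sketch of that patchify step (illustrative only, not this package's code):

```python
import numpy as np

def image_to_patches(image, patch_size=16):
    """Split an (H, W, C) image into non-overlapping patch_size x patch_size
    patches, each flattened into a token of length patch_size*patch_size*C."""
    h, w, c = image.shape
    assert h % patch_size == 0 and w % patch_size == 0
    return (image
            .reshape(h // patch_size, patch_size, w // patch_size, patch_size, c)
            .transpose(0, 2, 1, 3, 4)          # group the two patch-grid axes
            .reshape(-1, patch_size * patch_size * c))

img = np.arange(224 * 224 * 3, dtype=np.float32).reshape(224, 224, 3)
tokens = image_to_patches(img)
print(tokens.shape)  # (196, 768): a 14x14 grid of 16x16x3 patches
```

In the actual model each flattened patch is then linearly projected to the embedding dimension and a learned position embedding is added before the Transformer encoder.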

Download files

Download the file for your platform.

Source Distribution

Attention_and_Transformers-0.0.1.tar.gz (7.6 kB, source)

Built Distribution


Attention_and_Transformers-0.0.1-py3-none-any.whl (8.4 kB, Python 3)

File details

Details for the file Attention_and_Transformers-0.0.1.tar.gz.


File hashes

Hashes for Attention_and_Transformers-0.0.1.tar.gz:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 34559a79d900db5e40f3056e34a1d733d95b249bbf4f2c3d9df8c4db3e259a56 |
| MD5 | c8511ddbbafc8f9bd4819debf82c1bcf |
| BLAKE2b-256 | 5f11abb921146cdd95edd31e8b51a35354a246b9f3e141ea614c8c71ffdc2a7d |

See the pip documentation on hash-checking mode for details on using these hashes.
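To check a downloaded file against the digests above, you can stream it through Python's standard hashlib. This is a generic sketch; substitute the wheel's filename and hash as needed:

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 in chunks and return the hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Expected SHA256 for the sdist, copied from the table above.
expected = "34559a79d900db5e40f3056e34a1d733d95b249bbf4f2c3d9df8c4db3e259a56"
# After downloading, compare:
# assert sha256_of("Attention_and_Transformers-0.0.1.tar.gz") == expected
```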

File details

Details for the file Attention_and_Transformers-0.0.1-py3-none-any.whl.


File hashes

Hashes for Attention_and_Transformers-0.0.1-py3-none-any.whl:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 3035de547414eb7fc292a4e4e3dcb4bc4b57aa8bb544a1cd4e625eb647f23069 |
| MD5 | c3613d59a4186f4f13fa17dfbb08999c |
| BLAKE2b-256 | 8ff08046683eb8b6738c8fafa41c8f6ab317d2331000fa5303fd713134b87ee2 |

