Building attention mechanisms and Transformer models from scratch (alias: ATF).
Project description
Attention mechanisms and Transformers
- The goal of this repository is to host the basic architecture and model training code for the different attention mechanisms and Transformer architectures.
- At the moment, I am more interested in learning and recreating these architectures from scratch than in full-fledged training, so for now I'll train the models only on small datasets.
Installation
- Using pip to install from PyPI:
pip install Attention-and-Transformers
- Using pip to install the latest version from GitHub:
pip install git+https://github.com/veb-101/Attention-and-Transformers.git
- Local clone and install:
git clone https://github.com/veb-101/Attention-and-Transformers.git atf
cd atf
python setup.py install
Example Use
python load_test.py
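The load_test.py script in the repository serves as a quick smoke test of the bundled layers. A minimal sketch of such a check is shown below; the import path and constructor arguments are hypothetical placeholders rather than the package's confirmed API, so consult the repository for the exact names.

```python
# Hypothetical smoke test: the import path and class name below are assumptions,
# not the package's confirmed API; check the repository for the actual names.
import tensorflow as tf
from Attention_and_Transformers.MobileViT_v1 import MultiHeadSelfAttention  # assumed import path

dummy = tf.random.normal((1, 16, 64))        # (batch, tokens, embed_dim)
layer = MultiHeadSelfAttention(num_heads=4)  # assumed constructor signature
print(layer(dummy).shape)                    # expected: (1, 16, 64)
```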
Attention Mechanisms
No. | Mechanism | Paper
---|---|---
1 | Multi-head Self Attention | Attention Is All You Need
2 | Multi-head Self Attention 2D | MobileViT V1
3 | Separable Self Attention | MobileViT V2
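For reference, the sketch below builds the first mechanism in the table, multi-head self attention from "Attention Is All You Need", using plain TensorFlow/Keras ops. It is an independent illustration of the standard scaled dot-product formulation, not the package's own implementation.

```python
import tensorflow as tf
from tensorflow.keras import layers


class MultiHeadSelfAttention(layers.Layer):
    """Illustrative multi-head self attention (scaled dot-product); not the package's code."""

    def __init__(self, embed_dim=64, num_heads=4, **kwargs):
        super().__init__(**kwargs)
        assert embed_dim % num_heads == 0, "embed_dim must be divisible by num_heads"
        self.embed_dim = embed_dim
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        self.qkv = layers.Dense(3 * embed_dim)   # joint Q, K, V projection
        self.out_proj = layers.Dense(embed_dim)  # final output projection

    def call(self, x):
        # x: (batch, tokens, embed_dim)
        batch, tokens = tf.shape(x)[0], tf.shape(x)[1]
        qkv = self.qkv(x)                                         # (B, N, 3*D)
        qkv = tf.reshape(qkv, (batch, tokens, 3, self.num_heads, self.head_dim))
        qkv = tf.transpose(qkv, (2, 0, 3, 1, 4))                  # (3, B, heads, N, head_dim)
        q, k, v = qkv[0], qkv[1], qkv[2]
        scale = tf.cast(self.head_dim, x.dtype) ** -0.5
        attn = tf.matmul(q, k, transpose_b=True) * scale          # (B, heads, N, N)
        attn = tf.nn.softmax(attn, axis=-1)
        out = tf.matmul(attn, v)                                  # (B, heads, N, head_dim)
        out = tf.transpose(out, (0, 2, 1, 3))                     # (B, N, heads, head_dim)
        out = tf.reshape(out, (batch, tokens, self.embed_dim))
        return self.out_proj(out)                                 # (B, N, embed_dim)


# Quick shape check on random data.
x = tf.random.normal((2, 16, 64))          # (batch, tokens, embed_dim)
print(MultiHeadSelfAttention()(x).shape)   # expected: (2, 16, 64)
```

By contrast, the separable self attention of MobileViT V2 scores tokens against a single learned latent token and mixes values with element-wise operations, which drops the quadratic token-to-token attention map and makes the cost linear in the number of tokens.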
Transformer Models
Download files
Download the file for your platform.
Source Distribution: Attention_and_Transformers-0.0.15.tar.gz
Built Distribution: Attention_and_Transformers-0.0.15-py3-none-any.whl
File details
Details for the file Attention_and_Transformers-0.0.15.tar.gz.
File metadata
- Download URL: Attention_and_Transformers-0.0.15.tar.gz
- Upload date:
- Size: 17.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.10.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | 18de0593625a77b0dacff19e64ef77b6860e4e9a8d6f06e99f9448c127f0fd07
MD5 | c75f073989d43cbef4aea9d4950d427d
BLAKE2b-256 | 0c5d1b143ff86a9182751ac0ddabac126afce2f1c0e6ee91e812f46d0c9d2b3f
File details
Details for the file Attention_and_Transformers-0.0.15-py3-none-any.whl.
File metadata
- Download URL: Attention_and_Transformers-0.0.15-py3-none-any.whl
- Upload date:
- Size: 24.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.10.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | a32c67a0fcb200627baad4f66e7bcec4edc96771f1faf67d7af1c669ce139ae3
MD5 | e91cb98da61973197058849f34b4c2c8
BLAKE2b-256 | 862c83acacb0fa37c7e47809d896287e2440ba66682f4f948e423148dcca8482