A bunch of transformer implementations

Project description

Transformer Implementations


Transformer implementations, with examples of how to use them.

Implemented:

  • Vanilla Transformer
  • ViT - Vision Transformer
  • DeiT - Data-efficient Image Transformers
  • BERT - Bidirectional Encoder Representations from Transformers
  • GPT - Generative Pre-trained Transformer
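
ViT is importable from transformer_package.models, as the example below shows. Assuming the other classes are exposed from the same module (an assumption; check the package's notebooks for the exact names), imports would look like:

from transformer_package.models import ViT  # confirmed by the example below

# Hypothetical: the remaining class names are assumed from the list above
# from transformer_package.models import Transformer, DeiT, BERT, GPT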

Installation

PyPi

$ pip install transformer-implementations

or, to build from source:

$ python setup.py build
$ python setup.py install

Example

The notebooks directory contains a notebook for each of these models showing its intended use, such as image classification with the Vision Transformer (ViT). Check them out!

import torch
from transformer_package.models import ViT

# Model parameters
image_size = 28     # input images are 28x28 (e.g. MNIST)
channel_size = 1    # grayscale input
patch_size = 7      # 28/7 = 4 patches per side, 16 patches total
embed_size = 512
num_heads = 8
classes = 10
num_layers = 3
hidden_size = 256
dropout = 0.2

DEVICE = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = ViT(image_size,
            channel_size,
            patch_size,
            embed_size,
            num_heads,
            classes,
            num_layers,
            hidden_size,
            dropout=dropout).to(DEVICE)

# image_tensor: a (batch, channels, height, width) tensor; a random
# placeholder stands in for a real image here
image_tensor = torch.randn(1, channel_size, image_size, image_size).to(DEVICE)

prediction = model(image_tensor)
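
For completeness, a minimal training step for the model above; a sketch that assumes the model returns raw class logits and that loader is a standard PyTorch DataLoader (a placeholder name, not part of the package):

import torch.nn as nn

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=3e-4)

model.train()
for images, labels in loader:  # 'loader' is a placeholder DataLoader
    images, labels = images.to(DEVICE), labels.to(DEVICE)
    optimizer.zero_grad()
    logits = model(images)            # assumed to be raw class logits
    loss = criterion(logits, labels)
    loss.backward()
    optimizer.step()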

Language Translation

from "Attention is All You Need": https://arxiv.org/pdf/1706.03762.pdf

Models trained with this implementation:
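
The package's vanilla Transformer notebook covers translation; since its constructor isn't shown here, the sketch below uses PyTorch's built-in nn.Transformer purely to illustrate the encoder-decoder setup from the paper (all dimensions and names are illustrative, and positional encoding is omitted for brevity):

import torch
import torch.nn as nn

src_vocab, tgt_vocab, d_model = 1000, 1000, 512  # toy sizes

embed_src = nn.Embedding(src_vocab, d_model)
embed_tgt = nn.Embedding(tgt_vocab, d_model)
transformer = nn.Transformer(d_model=d_model, nhead=8,
                             num_encoder_layers=6, num_decoder_layers=6)
generator = nn.Linear(d_model, tgt_vocab)

src = torch.randint(0, src_vocab, (12, 2))  # (src_len, batch)
tgt = torch.randint(0, tgt_vocab, (10, 2))  # (tgt_len, batch)

# Causal mask: each target position attends only to earlier positions
tgt_mask = nn.Transformer.generate_square_subsequent_mask(tgt.size(0))

out = transformer(embed_src(src), embed_tgt(tgt), tgt_mask=tgt_mask)
logits = generator(out)  # (tgt_len, batch, tgt_vocab)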

Multi-class Image Classification with Vision Transformers (ViT)

from "An Image is Worth 16x16 words: Transformers for image recognition at scale": https://arxiv.org/pdf/2010.11929v1.pdf

Models trained with this implementation:

Note: ViT does not perform well when trained from scratch on small datasets; the original paper's results rely on large-scale pre-training.
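
To make the hyperparameters in the example above concrete, ViT splits each image into non-overlapping patches that become the token sequence; a quick check of the numbers (standard ViT patching arithmetic, not package-specific code):

image_size, patch_size, channel_size = 28, 7, 1

num_patches = (image_size // patch_size) ** 2  # (28 // 7) ** 2 = 16 tokens
patch_dim = channel_size * patch_size ** 2     # 1 * 7 * 7 = 49 values per patch

print(num_patches, patch_dim)  # 16 49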

Multi-class Image Classification with Data-efficient Image Transformers (DeiT)

from "Training data-efficient image transformers & distillation through attention": https://arxiv.org/pdf/2012.12877v1.pdf

Models trained with this implementation:
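
DeiT's central idea is distillation through attention: alongside the class token, a distillation token is trained against a teacher network's predictions. A minimal sketch of the paper's hard-label distillation objective (the logits and labels here are placeholders, not this package's API):

import torch
import torch.nn.functional as F

def hard_distillation_loss(cls_logits, dist_logits, teacher_logits, labels):
    # The class token learns from the ground-truth labels...
    loss_cls = F.cross_entropy(cls_logits, labels)
    # ...while the distillation token learns from the teacher's hard decisions
    teacher_labels = teacher_logits.argmax(dim=-1)
    loss_dist = F.cross_entropy(dist_logits, teacher_labels)
    return 0.5 * loss_cls + 0.5 * loss_dist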

Download files

Download the file for your platform.

Source Distribution

transformer_implementations-0.0.9.tar.gz (7.8 kB)

Built Distribution

transformer_implementations-0.0.9-py3-none-any.whl (9.4 kB)

File details

Details for the file transformer_implementations-0.0.9.tar.gz.

File metadata

  • Download URL: transformer_implementations-0.0.9.tar.gz
  • Size: 7.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.10.0 pkginfo/1.7.1 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.2 CPython/3.8.10

File hashes

Hashes for transformer_implementations-0.0.9.tar.gz:

  • SHA256: 6d5a72d4b34646bd9c42b1642bded6f26c48a4c10e2cf40fd196179e40062aa4
  • MD5: 74a18ef7be71066ea3735b7bba4d6816
  • BLAKE2b-256: c64d6cdb70e02fe41041c7bf2727c42284aa610b73676ed7a7e9628e5d40de6a

File details

Details for the file transformer_implementations-0.0.9-py3-none-any.whl.

File metadata

  • Download URL: transformer_implementations-0.0.9-py3-none-any.whl
  • Size: 9.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.10.0 pkginfo/1.7.1 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.2 CPython/3.8.10

File hashes

Hashes for transformer_implementations-0.0.9-py3-none-any.whl:

  • SHA256: 7218bfe9f0d5a4f507d2c2b72e0f387cf826cd14c1817d49a62d75c833b77a08
  • MD5: 12503fe48b648c3085a976de6a214130
  • BLAKE2b-256: 7445632c964b1dffdbc4157781702bdda34d67c7fcbe2cbce0885eaccb7fdde3
