
TensorFlow-compatible Transformer layers and models.


# maximal

See the [Official Documentation site](https://ivanbongiorni.github.io/maximal/)

Current version: 1.1

A TensorFlow-compatible Python library that provides models and layers to implement custom Transformer neural networks.

Built on TensorFlow 2.

<img src="https://github.com/IvanBongiorni/maximal/blob/main/utils/maximal_stablediffusion_00.png" align="center">

Logo generated by Stable Diffusion 2.1

# Installation

Installation is straightforward:

```
pip install maximal
```

# How to use it?

maximal is typically imported as:

```python
import maximal
from maximal.layers import TransformerLayer, GPTLayer
```

Its layers can then be used in a tf.keras model like any other Keras layer.
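
For instance, a toy GPT-style language model can be assembled in a `Sequential` model. The snippet below is a minimal sketch: the constructor arguments passed to `PositionalEmbedding` and `GPTLayer` (`depth`, `heads`, `ff_nodes`, and their order) are assumptions for illustration, not maximal's confirmed signatures; check the [Official Documentation](https://ivanbongiorni.github.io/maximal/) for the exact parameters.

```python
import tensorflow as tf
from maximal.layers import PositionalEmbedding, GPTLayer

VOCAB_SIZE = 10_000  # hypothetical vocabulary size
SEQ_LEN = 128        # hypothetical context length
DEPTH = 256          # hypothetical embedding dimension

# NOTE: the argument names below are illustrative assumptions;
# consult the official maximal docs for the real signatures.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(SEQ_LEN,)),
    PositionalEmbedding(SEQ_LEN, VOCAB_SIZE, DEPTH),  # token + position embeddings
    GPTLayer(depth=DEPTH, heads=4, ff_nodes=512),     # causal Transformer block
    tf.keras.layers.Dense(VOCAB_SIZE),                # next-token logits
])

model.compile(
    optimizer='adam',
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
```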

# Documentation

An [Official Website](https://ivanbongiorni.github.io/maximal/) is now available with documentation and tutorials.

# Elements

In layers.py:

  • SelfAttention: keras.Layer, computes Scaled Dot-Product Attention (see the plain-TensorFlow sketch after this list).

  • MultiHeadSelfAttention: keras.Layer, a concatenation of SelfAttention layers whose output is projected back to the original input shape through a linear transformation.

  • PositionalEmbedding: keras.Layer, implements the two Embedding layers used in the Transformer literature, one for tokens and one for positions. The positional encoding is learned through a tf.keras.layers.Embedding() layer, instead of the deterministic positional encoding of the original paper.

  • TransformerLayer: keras.Layer, a single Transformer Encoder block. It can be used inside any Sequential() model in Keras.

  • GPTLayer: keras.Layer, a GPT block. Similar to TransformerLayer but with a causal Attention mechanism. It can be used inside any Sequential() model in Keras.
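
To make the attention mechanism concrete, here is a small, self-contained sketch of Scaled Dot-Product Attention with an optional causal mask, written in plain TensorFlow rather than with maximal's own classes; it only illustrates the computation these layers are built around.

```python
import tensorflow as tf

def scaled_dot_product_attention(q, k, v, causal=False):
    """Scaled Dot-Product Attention: softmax(Q K^T / sqrt(d_k)) V.

    q, k, v: tensors of shape (batch, seq_len, d_k).
    causal:  if True, each position attends only to itself and earlier
             positions, as in GPT-style (decoder) blocks.
    """
    d_k = tf.cast(tf.shape(k)[-1], tf.float32)
    scores = tf.matmul(q, k, transpose_b=True) / tf.math.sqrt(d_k)  # (batch, seq, seq)

    if causal:
        seq_len = tf.shape(scores)[-1]
        # Lower-triangular mask: 1 where attention is allowed, 0 elsewhere.
        mask = tf.linalg.band_part(tf.ones((seq_len, seq_len)), -1, 0)
        scores += (1.0 - mask) * -1e9  # forbidden positions vanish after softmax

    weights = tf.nn.softmax(scores, axis=-1)
    return tf.matmul(weights, v)

# Example: causal self-attention over a random batch of embeddings.
x = tf.random.normal((2, 10, 64))
out = scaled_dot_product_attention(x, x, x, causal=True)
print(out.shape)  # (2, 10, 64)
```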

In schedules.py:

  • OriginalTransformerSchedule: implements the learning rate schedule of the original Transformer paper, taken from this [official TensorFlow tutorial](https://www.tensorflow.org/text/tutorials/transformer). A re-implementation sketch follows below.
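
The schedule follows the formula from "Attention Is All You Need": `lrate = d_model^-0.5 * min(step^-0.5, step * warmup_steps^-1.5)`, i.e. a linear warmup followed by inverse-square-root decay. The sketch below re-implements it as a standard Keras `LearningRateSchedule`, mirroring the linked tutorial; it is an illustration, not maximal's own code.

```python
import tensorflow as tf

class TransformerSchedule(tf.keras.optimizers.schedules.LearningRateSchedule):
    """Linear warmup followed by inverse-square-root decay."""

    def __init__(self, d_model, warmup_steps=4000):
        super().__init__()
        self.d_model = tf.cast(d_model, tf.float32)
        self.warmup_steps = warmup_steps

    def __call__(self, step):
        step = tf.cast(step, tf.float32)
        arg1 = tf.math.rsqrt(step)                 # step^-0.5 (decay phase)
        arg2 = step * (self.warmup_steps ** -1.5)  # linear warmup phase
        return tf.math.rsqrt(self.d_model) * tf.math.minimum(arg1, arg2)

# Typical usage, with the Adam hyperparameters from the paper:
learning_rate = TransformerSchedule(d_model=256)
optimizer = tf.keras.optimizers.Adam(learning_rate, beta_1=0.9, beta_2=0.98, epsilon=1e-9)
```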

# Requirements

```
numpy
tensorflow >= 2.0
```

# Author

Ivan Bongiorni. [LinkedIn](https://www.linkedin.com/in/ivan-bongiorni-b8a583164/)

# License

2020 Ivan Bongiorni

This repository is licensed under the MIT license. See [LICENCE.txt]() for further details.



Download files


Source Distribution

maximal-1.1.tar.gz (8.7 kB)

Uploaded Source

File details

Details for the file maximal-1.1.tar.gz.

File metadata

  • Download URL: maximal-1.1.tar.gz
  • Upload date:
  • Size: 8.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.13

File hashes

Hashes for maximal-1.1.tar.gz:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 79f8f22ae130439e8b5f1835733195246481c0f2a4115323e0d51d5ff01b56a1 |
| MD5 | 774c5f21b6a582c2450dc092b2502045 |
| BLAKE2b-256 | 7f1439252906625363f4d8263e9f8aa2ef3267043c61f2fda03f12c2fc7226ad |

