einorm
An einops-style generalized normalization layer.
Installation
You need torch >= 1.13 (or the standalone functorch package) installed:

```shell
pip install einorm
```
Usage
```python
from einorm import Einorm

# Equivalent to nn.LayerNorm(1024)
Einorm("b n d", "d", d=1024)

# Specify the dimensions and sizes to normalize along.
Einorm("a b c d e", "b d", b=3, d=4)
```
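Conceptually, naming normalization axes means each slice along those axes is shifted to zero mean and scaled to unit variance, exactly as `nn.LayerNorm` does over its normalized dimensions. A minimal dependency-free sketch of the simplest case above, normalizing only the last axis of a `"b n d"` tensor (the `normalize` helper here is illustrative, not einorm's internal API):

```python
import math

def normalize(vec, eps=1e-5):
    """Shift one vector to zero mean and scale to unit variance,
    as LayerNorm does over its normalized dimensions."""
    mean = sum(vec) / len(vec)
    var = sum((x - mean) ** 2 for x in vec) / len(vec)
    return [(x - mean) / math.sqrt(var + eps) for x in vec]

# A tiny "b n d" batch with b=1, n=2, d=4: normalize along d only.
batch = [[[1.0, 2.0, 3.0, 4.0],
          [2.0, 2.0, 2.0, 2.0]]]
out = [[normalize(vec) for vec in seq] for seq in batch]
```

Each length-`d` row of `out` now has (approximately) zero mean and unit variance; the constant row maps to zeros, with `eps` guarding the division.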
According to ViT-22B, normalizing query and key in a head-wise fashion can help stabilize the training dynamics. This can be achieved by providing additional grouping arguments to Einorm:
```python
Einorm("b h n d", "d", "h", h=16, d=64)  # num_heads=16, head_dim=64
```
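A sketch of what head-wise grouping means here, under the assumption that grouping by `"h"` gives each head its own learnable gain (names and shapes below are illustrative, not einorm's internals): every length-`d` query/key vector is normalized independently, then scaled by its head's parameters.

```python
import math

def headwise_norm(x, gain, eps=1e-5):
    """x: nested list of shape (h, n, d); gain: (h, d).
    Each length-d vector is normalized, then scaled by its own
    head's gain -- one parameter set per head."""
    out = []
    for head, g in zip(x, gain):
        rows = []
        for vec in head:
            mean = sum(v for v in vec) / len(vec)
            var = sum((v - mean) ** 2 for v in vec) / len(vec)
            rows.append([(v - mean) / math.sqrt(var + eps) * gi
                         for v, gi in zip(vec, g)])
        out.append(rows)
    return out

# Two heads, one token each, head_dim=2; head 1 has double the gain.
q = [[[0.0, 2.0]], [[0.0, 2.0]]]
gain = [[1.0, 1.0], [2.0, 2.0]]
normed = headwise_norm(q, gain)
```

The same input vector normalizes to roughly `[-1, 1]` in head 0 and `[-2, 2]` in head 1, showing how per-head parameters let each attention head rescale its queries and keys independently.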
File details
Details for the file einorm-0.1.1.tar.gz.
File metadata
- Download URL: einorm-0.1.1.tar.gz
- Upload date:
- Size: 3.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.3.2 CPython/3.9.16 Linux/5.15.0-1033-azure
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 4302da9479607ae528747ded9bf29c6935ffdee77b3b7f7bec673a6983f1655b |
| MD5 | 6968e2f5d5e4220aed83e55a2c6be448 |
| BLAKE2b-256 | 9ffd3c290a6a11c64ce0f44a61fa6f00f1f0714242320908e6716e948af3aa1a |