Position embedding layers in Keras
Install
pip install keras-pos-embd
Usage
Trainable Embedding
import keras
from keras_pos_embd import PositionEmbedding
model = keras.models.Sequential()
model.add(PositionEmbedding(
    input_shape=(None,),
    input_dim=10,     # The maximum absolute value of positions.
    output_dim=2,     # The dimension of embeddings.
    mask_zero=10000,  # The index that represents padding (because `0` will be used in relative positioning).
    name='Pos-Embd',
))
model.compile('adam', keras.losses.mae, {})
model.summary()
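To sanity-check the shapes, feed the model a batch of position indices (a minimal sketch, assuming the model defined above; the index values are made up, and negative indices are allowed because the layer supports relative positioning):

import numpy as np

positions = np.array([[-2, -1, 0, 1, 2]])  # one sequence of five positions
print(model.predict(positions).shape)  # expected: (1, 5, 2), one 2-dim vector per position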
(Note that you don't need to enable mask_zero if you concatenate the output with other masked layers, such as word embeddings; see the sketch below.)
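For instance, a word embedding created with mask_zero=True already carries the padding mask, so the position branch can stay unmasked when the two are concatenated. A minimal functional-API sketch (the vocabulary size and dimensions are made-up values):

import keras
from keras_pos_embd import PositionEmbedding

tokens = keras.layers.Input(shape=(None,), dtype='int32')     # word ids
positions = keras.layers.Input(shape=(None,), dtype='int32')  # position indices

word_embd = keras.layers.Embedding(input_dim=100, output_dim=16, mask_zero=True)(tokens)  # carries the mask
pos_embd = PositionEmbedding(input_dim=10, output_dim=2)(positions)  # no mask_zero needed here

combined = keras.layers.Concatenate()([word_embd, pos_embd])
model = keras.models.Model(inputs=[tokens, positions], outputs=combined)
model.summary()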
Sin & Cos Embedding
The sine and cosine embedding has no trainable weights. The layer has two modes; in expand mode it works just like PositionEmbedding:
import keras
from keras_pos_embd import TrigPosEmbedding
model = keras.models.Sequential()
model.add(TrigPosEmbedding(
    input_shape=(None,),
    output_dim=30,  # The dimension of embeddings.
    mode=TrigPosEmbedding.MODE_EXPAND,  # Use `expand` mode.
    name='Pos-Embd',
))
model.compile('adam', keras.losses.mae, {})
model.summary()
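For reference, the generated values are typically the fixed sinusoids from "Attention Is All You Need": sines on even dimensions and cosines on odd ones. A NumPy sketch of that standard formulation (whether TrigPosEmbedding follows this exact even/odd convention is an assumption):

import numpy as np

def sinusoidal_encoding(seq_len, output_dim):
    # PE(pos, 2i) = sin(pos / 10000^(2i/d)); PE(pos, 2i+1) = cos(pos / 10000^(2i/d))
    positions = np.arange(seq_len)[:, np.newaxis]  # (seq_len, 1)
    dims = np.arange(output_dim)[np.newaxis, :]    # (1, output_dim)
    angles = positions / np.power(10000.0, (2 * (dims // 2)) / float(output_dim))
    encoding = np.zeros((seq_len, output_dim))
    encoding[:, 0::2] = np.sin(angles[:, 0::2])  # even dimensions: sine
    encoding[:, 1::2] = np.cos(angles[:, 1::2])  # odd dimensions: cosine
    return encoding

print(sinusoidal_encoding(5, 30).shape)  # (5, 30)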
If you want to add this embedding to an existing embedding, there is no need for a separate position input in add mode:
import keras
from keras_pos_embd import TrigPosEmbedding
model = keras.models.Sequential()
model.add(TrigPosEmbedding(
    input_shape=(None, 100),
    mode=TrigPosEmbedding.MODE_ADD,  # Use `add` mode (default).
    name='Pos-Embd',
))
model.compile('adam', keras.losses.mae, {})
model.summary()
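In practice the layer often sits directly on top of a learned word embedding, adding the fixed sinusoids to the learned vectors. A minimal sketch (the vocabulary size is a made-up value, and the padding mask from the word embedding is assumed to pass through):

import keras
from keras_pos_embd import TrigPosEmbedding

model = keras.models.Sequential()
model.add(keras.layers.Embedding(
    input_shape=(None,),
    input_dim=100,   # made-up vocabulary size
    output_dim=30,   # the dimension the sinusoids are added to
    mask_zero=True,  # the word embedding supplies the padding mask
))
model.add(TrigPosEmbedding(
    mode=TrigPosEmbedding.MODE_ADD,
    name='Pos-Embd',
))
model.compile('adam', keras.losses.mae, {})
model.summary()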