Position embedding layers in Keras
Install
pip install keras-pos-embd
Usage
import keras
from keras_pos_embd import PositionEmbedding
model = keras.models.Sequential()
model.add(PositionEmbedding(
    input_dim=10,     # The maximum absolute value of positions.
    output_dim=2,     # The dimension of the embeddings.
    mask_zero=10000,  # The index that represents padding (`0` itself is used in relative positioning).
    input_shape=(None,),
    name='Pos-Embd',
))
(Note that you don't need to enable mask_zero if you concatenate the output with other masked layers, such as word embeddings.)
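The idea behind a trainable position embedding can be sketched in plain NumPy (a minimal illustration with hypothetical names, not the library's implementation): each position index looks up a learned row in an (input_dim, output_dim) table, and the resulting vectors are added to or concatenated with the token embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)

max_len, embed_dim = 10, 2  # mirrors input_dim=10, output_dim=2 above
# In the real layer this table is a trainable weight; here it is random.
pos_table = rng.normal(size=(max_len, embed_dim))

def position_embed(seq_len):
    """Look up one embedding row per position 0..seq_len-1."""
    positions = np.arange(seq_len)
    return pos_table[positions]  # shape: (seq_len, embed_dim)

out = position_embed(5)
print(out.shape)  # (5, 2)
```

During training, gradients flow into the lookup table just as with a word-embedding matrix, so the model learns a vector per position.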
Source distribution: keras-pos-embd-0.3.tar.gz (2.6 kB)