
Keras Position Embedding


[中文|English]

Position embedding layers in Keras.

Install

pip install keras-pos-embd

Usage

Trainable Embedding

from tensorflow import keras
from keras_pos_embd import PositionEmbedding

model = keras.models.Sequential()
model.add(PositionEmbedding(
    input_shape=(None,),
    input_dim=10,     # The maximum absolute value of positions.
    output_dim=2,     # The dimension of embeddings.
    mask_zero=10000,  # The index that represents padding (because `0` will be used in relative positioning).
    mode=PositionEmbedding.MODE_EXPAND,
))
model.compile('adam', 'mse')
model.summary()
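As a quick sanity check, here is a minimal usage sketch (the `positions` array is illustrative; it assumes that in expand mode the layer consumes integer position indices whose magnitude does not exceed `input_dim`):

import numpy as np

# Positions in, embedding vectors out: (batch, seq_len) -> (batch, seq_len, output_dim).
positions = np.array([[1, 2, 3, 4, 5]])  # illustrative position indices, |pos| <= input_dim
print(model.predict(positions).shape)    # (1, 5, 2) given output_dim=2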

Note that you don't need to enable `mask_zero` on the position embedding if you add or concatenate it with other layers, such as word embeddings, that already carry masks:

from tensorflow import keras
from keras_pos_embd import PositionEmbedding

model = keras.models.Sequential()
model.add(keras.layers.Embedding(
    input_shape=(None,),
    input_dim=10,      # Vocabulary size.
    output_dim=5,      # Dimension of the word embeddings.
    mask_zero=True,    # Token index `0` is treated as padding.
))
model.add(PositionEmbedding(
    input_dim=100,     # Maximum supported sequence length.
    output_dim=5,      # Must match the word embedding dimension in `add` mode.
    mode=PositionEmbedding.MODE_ADD,
))
model.compile('adam', 'mse')
model.summary()
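For reference, a hedged usage sketch of the stacked model (the token array below is illustrative): token ids go in, and the word and position embeddings are summed element-wise, which is why both layers share the same output dimension:

import numpy as np

# Token ids in [0, 10); id 0 is padding because mask_zero=True.
tokens = np.array([[1, 3, 5, 0, 0]])  # shape (batch, seq_len)
print(model.predict(tokens).shape)    # (1, 5, 5): word + position embeddings summed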

Sin & Cos Embedding

The sine and cosine embedding has no trainable weights. The layer has the same three modes; in `expand` mode it works just like PositionEmbedding:

from tensorflow import keras
from keras_pos_embd import TrigPosEmbedding

model = keras.models.Sequential()
model.add(TrigPosEmbedding(
    input_shape=(None,),
    output_dim=30,                      # The dimension of embeddings.
    mode=TrigPosEmbedding.MODE_EXPAND,  # Use `expand` mode
))
model.compile('adam', 'mse')
model.summary()
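For intuition, the sketch below builds the classic Transformer-style sine/cosine table that layers like this are based on. It is illustrative only: the helper `sinusoidal_table` is not part of keras-pos-embd, and the layer's exact sin/cos layout may differ.

import numpy as np

def sinusoidal_table(seq_len, dim):
    # Hypothetical helper: sin on even feature indices, cos on odd ones,
    # with wavelengths growing geometrically up to 10000 * 2 * pi.
    positions = np.arange(seq_len)[:, np.newaxis]  # (seq_len, 1)
    feature_idx = np.arange(dim)[np.newaxis, :]    # (1, dim)
    angles = positions / np.power(10000.0, (2 * (feature_idx // 2)) / dim)
    table = np.zeros((seq_len, dim))
    table[:, 0::2] = np.sin(angles[:, 0::2])
    table[:, 1::2] = np.cos(angles[:, 1::2])
    return table

print(sinusoidal_table(seq_len=4, dim=30).shape)   # (4, 30), matching output_dim=30 above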

If you want to add this embedding to an existing embedding, there is no need to provide a separate position input when using `add` mode:

from tensorflow import keras
from keras_pos_embd import TrigPosEmbedding

model = keras.models.Sequential()
model.add(keras.layers.Embedding(
    input_shape=(None,),
    input_dim=10,     # Vocabulary size.
    output_dim=5,     # Dimension of the word embeddings.
    mask_zero=True,   # Token index `0` is treated as padding.
))
model.add(TrigPosEmbedding(
    output_dim=5,     # Must match the word embedding dimension in `add` mode.
    mode=TrigPosEmbedding.MODE_ADD,
))
model.compile('adam', 'mse')
model.summary()
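Because the sinusoidal signal is fixed, only the token embedding should contribute parameters. A small hedged check on the model built above:

# The TrigPosEmbedding layer should report no trainable weights;
# only the Embedding layer (10 * 5 = 50 parameters) contributes.
print(len(model.layers[-1].trainable_weights))  # expected: 0
print(model.count_params())                     # expected: 50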
