
Project description

Keras Position Embedding


[中文|English]

Position embedding layers in Keras.

Install

pip install keras-pos-embd

Usage

Trainable Embedding

from tensorflow import keras
from keras_pos_embd import PositionEmbedding

model = keras.models.Sequential()
model.add(PositionEmbedding(
    input_shape=(None,),
    input_dim=10,     # The maximum absolute value of positions.
    output_dim=2,     # The dimension of embeddings.
    mask_zero=10000,  # The index that represents padding (because `0` will be used in relative positioning).
    mode=PositionEmbedding.MODE_EXPAND,
))
model.compile('adam', 'mse')
model.summary()
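
For reference, here is a minimal usage sketch (not part of the original README), under the assumption that in expand mode the model input is a batch of integer positions and the output is one embedding vector per position:

import numpy as np

positions = np.array([[-2, -1, 0, 1, 2]])  # assumed input: (batch, seq_len) integer positions
embedded = model.predict(positions)        # expected output shape: (1, 5, 2)
print(embedded.shape)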

Note that you don't need to enable mask_zero on the position embedding if you add or concatenate it with other layers, such as word embeddings, that already carry masks:

from tensorflow import keras
from keras_pos_embd import PositionEmbedding

model = keras.models.Sequential()
model.add(keras.layers.Embedding(
    input_shape=(None,),
    input_dim=10,
    output_dim=5,
    mask_zero=True,   # Masks the padding index `0`; the mask propagates to later layers.
))
model.add(PositionEmbedding(
    input_dim=100,    # The maximum sequence length supported in `add` mode.
    output_dim=5,     # Must match the word embedding dimension so the two can be added.
    mode=PositionEmbedding.MODE_ADD,
))
model.compile('adam', 'mse')
model.summary()
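
A hedged usage sketch (assumptions: token index 0 is the padding index, and add mode returns the word embedding with the position embedding added element-wise, keeping the shape (batch, seq_len, 5)):

import numpy as np

token_ids = np.array([[1, 2, 3, 0, 0]])  # 0 assumed to be the masked padding index
outputs = model.predict(token_ids)       # expected output shape: (1, 5, 5)
print(outputs.shape)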

Sin & Cos Embedding

The sine and cosine embedding has no trainable weights. The layer has three modes; in expand mode, it works just like PositionEmbedding:

from tensorflow import keras
from keras_pos_embd import TrigPosEmbedding

model = keras.models.Sequential()
model.add(TrigPosEmbedding(
    input_shape=(None,),
    output_dim=30,                      # The dimension of embeddings.
    mode=TrigPosEmbedding.MODE_EXPAND,  # Use `expand` mode
))
model.compile('adam', 'mse')
model.summary()
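
A hypothetical check (assuming expand mode takes integer positions as input, as with PositionEmbedding); since the encoding is fixed, the model should report no trainable parameters:

import numpy as np

positions = np.array([[0, 1, 2, 3]])         # assumed input: (batch, seq_len) positions
encoded = model.predict(positions)           # expected output shape: (1, 4, 30)
print(encoded.shape, model.count_params())   # count_params() expected to be 0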

If you want to add this embedding to an existing embedding, there is no need for a separate position input in add mode:

from tensorflow import keras
from keras_pos_embd import TrigPosEmbedding

model = keras.models.Sequential()
model.add(keras.layers.Embedding(
    input_shape=(None,),
    input_dim=10,
    output_dim=5,
    mask_zero=True,
))
model.add(TrigPosEmbedding(
    output_dim=5,     # Must match the dimension of the preceding embedding in `add` mode.
    mode=TrigPosEmbedding.MODE_ADD,
))
model.compile('adam', 'mse')
model.summary()
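
For reference, the fixed encoding follows the standard Transformer-style sinusoid; below is a NumPy sketch of that common formulation (the exact channel layout used by TrigPosEmbedding may differ):

import numpy as np

def sinusoidal_encoding(seq_len, dim):
    # Standard sin/cos position encoding: even channels use sine, odd channels cosine.
    positions = np.arange(seq_len)[:, np.newaxis]             # (seq_len, 1)
    channels = np.arange(dim)[np.newaxis, :]                  # (1, dim)
    angles = positions / np.power(10000.0, (2 * (channels // 2)) / float(dim))
    encoding = np.zeros((seq_len, dim))
    encoding[:, 0::2] = np.sin(angles[:, 0::2])
    encoding[:, 1::2] = np.cos(angles[:, 1::2])
    return encoding

print(sinusoidal_encoding(seq_len=4, dim=6).shape)  # (4, 6)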

