Project description
Attention mechanism for processing sequence data that considers the context for each timestamp.
Install
pip install keras-self-attention
Usage
import keras
from keras_self_attention import Attention

model = keras.models.Sequential()
# Token embedding; mask_zero=True propagates the padding mask through the stack.
model.add(keras.layers.Embedding(input_dim=10000,
                                 output_dim=300,
                                 mask_zero=True))
# Bidirectional LSTM returns the full sequence so attention can score every timestamp.
model.add(keras.layers.Bidirectional(keras.layers.LSTM(units=128,
                                                       return_sequences=True)))
# Self-attention over the LSTM outputs; the (batch, time, features) shape is preserved.
model.add(Attention())
# Per-timestamp classification; softmax is needed for categorical_crossentropy.
model.add(keras.layers.Dense(units=5, activation='softmax'))
model.compile(
    optimizer='adam',
    loss='categorical_crossentropy',
    metrics=['categorical_accuracy'],
)
model.summary()
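The stack above can be smoke-tested on random toy data. The sketch below continues from the model defined above; the maximum sequence length of 50, the batch of 32 sequences, and the per-timestamp one-hot labels are illustrative assumptions, not values required by the library.

import numpy as np

# Toy data: 32 sequences of 50 token ids (starting at 1 so nothing is masked)
# and a one-hot label for each timestamp (5 classes, matching the Dense layer).
x = np.random.randint(1, 10000, size=(32, 50))
y = keras.utils.to_categorical(np.random.randint(0, 5, size=(32, 50)), num_classes=5)

model.fit(x, y, epochs=1, batch_size=8)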
attention_width
The global context may be too broad for some data. The attention_width parameter limits attention to a local window of timestamps around each position.
attention_activation
The activation function applied to the attention scores e_{t, t'}. No activation is applied by default. A plain NumPy sketch of how these two parameters interact follows the example below.
from keras_self_attention import Attention

# Restrict each timestamp to a local window and squash the scores with a sigmoid.
Attention(
    attention_width=15,
    attention_activation='sigmoid',
    name='Attention',
)
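For illustration only, the following NumPy sketch scores every pair of timestamps with an additive attention function, applies an activation to the scores, and masks out positions outside a local window. It is a simplified sketch of the idea behind the two parameters, not the layer's actual implementation, and the weight names (W_t, W_x, w_a) are assumptions.

import numpy as np

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

def local_self_attention(x, attention_width=15, attention_activation=sigmoid):
    """Toy additive self-attention over one sequence x of shape (time, features)."""
    time, features = x.shape
    rng = np.random.default_rng(0)
    # Assumed weight names, randomly initialised for illustration.
    W_t = rng.normal(scale=0.1, size=(features, features))
    W_x = rng.normal(scale=0.1, size=(features, features))
    w_a = rng.normal(scale=0.1, size=(features,))

    # Additive scores e_{t, t'} for every pair of timestamps.
    h = np.tanh((x @ W_t)[:, None, :] + (x @ W_x)[None, :, :])  # (time, time, features)
    e = attention_activation(h @ w_a)                            # (time, time)

    # Local context: only timestamps inside the attention_width window around t count.
    idx = np.arange(time)
    local = np.abs(idx[:, None] - idx[None, :]) <= attention_width // 2
    e = np.where(local, e, -1e9)

    # Softmax over t', then a weighted sum of the inputs for each timestamp.
    a = np.exp(e - e.max(axis=-1, keepdims=True))
    a /= a.sum(axis=-1, keepdims=True)
    return a @ x

print(local_self_attention(np.random.rand(20, 8)).shape)  # (20, 8)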
Download files
Source Distribution
Hashes for keras-self-attention-0.0.11.tar.gz

Algorithm | Hash digest
---|---
SHA256 | 765ce94dd2c748420ce834d2a8cbc7b14214f0a94525aedc5dccb5973f4ece49
MD5 | 47ef106a58bc96e6aff55f91dd1894c9
BLAKE2b-256 | 333c18a76ecb79c9ed85cb9120c6bb7226a1ad96829bf73779c0b2b1d6a87d92