Project description
pytorch-stateful-lstm
Free software: MIT license
Features
A PyTorch LSTM implementation powered by LibTorch, with support for:

- Hidden/cell state clipping.
- Skip connections.
- Variational dropout & DropConnect.
- Managed initial state.
- Built-in TBPTT (truncated backpropagation through time).
Benchmark: https://github.com/cnt-dev/pytorch-stateful-lstm/tree/master/benchmark
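For context on the variational dropout feature: unlike ordinary dropout, the variational scheme samples one dropout mask per sequence and reuses it at every timestep. A minimal sketch in plain PyTorch (illustrative only; `variational_dropout_mask` is a hypothetical helper, not this package's internal API):

```python
import torch

def variational_dropout_mask(batch_size, size, p, training=True):
    """Sample one dropout mask per sequence, to be reused at every
    timestep (the variational scheme). Inverted-dropout scaling keeps
    the expected activation unchanged."""
    if not training or p == 0.0:
        return torch.ones(batch_size, size)
    keep = 1.0 - p
    return torch.bernoulli(torch.full((batch_size, size), keep)) / keep

# One mask per sequence in the batch; the SAME mask multiplies the
# hidden state at every timestep, instead of resampling per step.
mask = variational_dropout_mask(batch_size=4, size=5, p=0.5)
```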
Install
Prerequisites: torch>=1.0.0 and a supported C++11 compiler. To install through pip:
```
pip install pytorch-stateful-lstm
```
Usage
Example:
```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence, PackedSequence
from pytorch_stateful_lstm import StatefulUnidirectionalLstm

lstm = StatefulUnidirectionalLstm(
    num_layers=2,
    input_size=3,
    hidden_size=5,
    cell_size=7,
)

inputs = pack_padded_sequence(torch.rand(4, 5, 3), [5, 4, 2, 1], batch_first=True)
raw_packed_outputs, lstm_state = lstm(
    inputs.data,
    inputs.batch_sizes,
)
outputs = PackedSequence(raw_packed_outputs, inputs.batch_sizes)
```
For the definition of parameters, see https://github.com/cnt-dev/pytorch-stateful-lstm/tree/master/extension.
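Since the model returns packed data, the padded `(batch, seq, feature)` tensor can be recovered with PyTorch's standard `pad_packed_sequence`. A small sketch using random stand-in data in place of the model's outputs:

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

# Pack a padded batch (4 sequences, max length 5, feature size 3),
# then round-trip it back to padded form, as one would do with the
# `outputs` PackedSequence from the example above.
packed = pack_padded_sequence(torch.rand(4, 5, 3), [5, 4, 2, 1], batch_first=True)
padded, lengths = pad_packed_sequence(packed, batch_first=True)
# padded has shape (4, 5, 3); positions past each length are zero-padded.
```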
Credits
This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.
History
0.1.0 (2019-01-03)
First release on PyPI.
Project details
Download files
Source Distribution
Hashes for pytorch_stateful_lstm-1.5.2.tar.gz
| Algorithm | Hash digest |
|---|---|
| SHA256 | b4835d1be62d089215a5b7eb7d5c062604090c56099d9bfeb26d791ec4582388 |
| MD5 | d1f94a3547d2c6e8f66f0c6faba94a96 |
| BLAKE2b-256 | e602b49a3dff1b3fe741d6acf80ff1b9f0eb5c8ed4fd99ec5845d98b2560c9f7 |