Project description
pytorch-stateful-lstm
Free software: MIT license
Features
PyTorch LSTM implementation powered by LibTorch, with support for:
Hidden/Cell Clip.
Skip Connections.
Variational Dropout & DropConnect.
Managed Initial State.
Built-in TBPTT.
Benchmark: https://github.com/cnt-dev/pytorch-stateful-lstm/tree/master/benchmark
Install
Prerequisites: torch>=1.0.0 and a supported C++11 compiler. To install through pip:
pip install pytorch-stateful-lstm
Usage
Example:
import torch
from torch.nn.utils.rnn import pack_padded_sequence, PackedSequence
from pytorch_stateful_lstm import StatefulUnidirectionalLstm

lstm = StatefulUnidirectionalLstm(
    num_layers=2,
    input_size=3,
    hidden_size=5,
    cell_size=7,
)

inputs = pack_padded_sequence(
    torch.rand(4, 5, 3),
    [5, 4, 2, 1],
    batch_first=True,
)
raw_packed_outputs, lstm_state = lstm(
    inputs.data,
    inputs.batch_sizes,
)
outputs = PackedSequence(raw_packed_outputs, inputs.batch_sizes)
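To see what the LSTM actually receives, note that `inputs.data` is a flat tensor with one row per (sequence, timestep) pair, and `inputs.batch_sizes[t]` is the number of sequences still active at timestep `t`. The sketch below illustrates, in plain Python (no torch needed), how `pack_padded_sequence` derives `batch_sizes` from the descending lengths `[5, 4, 2, 1]` used in the example; `packed_batch_sizes` is a hypothetical helper for illustration, not part of this package.

```python
def packed_batch_sizes(lengths):
    """batch_sizes[t] = number of sequences whose length exceeds t.

    Mirrors what pack_padded_sequence computes; lengths must be
    sorted in descending order, as in the usage example above.
    """
    assert lengths == sorted(lengths, reverse=True), "lengths must be descending"
    max_len = lengths[0] if lengths else 0
    return [sum(1 for n in lengths if n > t) for t in range(max_len)]

sizes = packed_batch_sizes([5, 4, 2, 1])
print(sizes)       # [4, 3, 2, 2, 1]
print(sum(sizes))  # 12 -- total rows in inputs.data
```

Because `raw_packed_outputs` keeps this same flat layout, wrapping it back into a `PackedSequence` with the original `batch_sizes` (as the example does) restores a structure that downstream `torch.nn.utils.rnn` functions understand.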
For the definitions of the constructor parameters, see https://github.com/cnt-dev/pytorch-stateful-lstm/tree/master/extension.
Credits
This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.
History
0.1.0 (2019-01-03)
First release on PyPI.
Project details
Download files
Download the file for your platform.
Source Distribution
Hashes for pytorch_stateful_lstm-1.3.1.tar.gz
Algorithm | Hash digest
---|---
SHA256 | 91336407a246e4a4c6fb3bdcd2853cf7342bf3628712f33eef8fed631564d1ff
MD5 | a41db518db740b4f67e2457080112de9
BLAKE2b-256 | 4d082f506129e3a31dbb61c7ad282614146dfe7ba9e40f1bb59a9eab917a5e95