Project description
pytorch-stateful-lstm
Free software: MIT license
Features
PyTorch LSTM implementation powered by LibTorch, with support for:
Hidden/Cell Clip.
Skip Connections.
Variational Dropout & DropConnect.
Managed Initial State.
Built-in TBPTT.
Benchmark: https://github.com/cnt-dev/pytorch-stateful-lstm/tree/master/benchmark
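The managed-initial-state and built-in TBPTT features can be illustrated conceptually with the standard torch.nn.LSTM — a hedged sketch of the technique only, not this package's API (this package handles the state carry-over internally):

```python
import torch
import torch.nn as nn

# Conceptual sketch of truncated backpropagation through time (TBPTT):
# run the LSTM over fixed-size chunks and detach the recurrent state
# between chunks so gradients stop at the truncation boundary.
lstm = nn.LSTM(input_size=3, hidden_size=5, batch_first=True)
state = None  # (h, c); nn.LSTM initializes zeros when state is None

long_sequence = torch.rand(4, 20, 3)  # batch=4, time=20, features=3
for chunk in long_sequence.split(5, dim=1):  # 5-step truncation windows
    output, state = lstm(chunk, state)
    # Detach so backprop does not cross the chunk boundary.
    state = tuple(s.detach() for s in state)
```

Each loop iteration consumes a 5-step window and carries the detached hidden/cell state forward, which is the behavior this package exposes as "Managed Initial State" and "Built-in TBPTT".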
Install
Prerequisites: torch>=1.0.0 and a C++11-compatible compiler. To install through pip:
pip install pytorch-stateful-lstm
Usage
Example:
import torch
from torch.nn.utils.rnn import pack_padded_sequence, PackedSequence
from pytorch_stateful_lstm import StatefulUnidirectionalLstm

lstm = StatefulUnidirectionalLstm(
    num_layers=2,
    input_size=3,
    hidden_size=5,
    cell_size=7,
)
inputs = pack_padded_sequence(torch.rand(4, 5, 3), [5, 4, 2, 1], batch_first=True)
raw_packed_outputs, lstm_state = lstm(
    inputs.data,
    inputs.batch_sizes,
)
outputs = PackedSequence(raw_packed_outputs, inputs.batch_sizes)
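To recover a padded tensor from a PackedSequence like `outputs` above, plain PyTorch's pad_packed_sequence applies — this step is standard PyTorch, not part of this package. The sketch below builds a stand-in PackedSequence the same way, since it does not assume the extension is installed:

```python
import torch
from torch.nn.utils.rnn import (
    pack_padded_sequence, pad_packed_sequence, PackedSequence
)

# Build a PackedSequence the same way as in the usage example; its
# `data` stands in for the raw packed LSTM output.
inputs = pack_padded_sequence(
    torch.rand(4, 5, 3), [5, 4, 2, 1], batch_first=True
)
outputs = PackedSequence(inputs.data, inputs.batch_sizes)

# Unpack back to a padded (batch, time, features) tensor plus lengths.
padded, lengths = pad_packed_sequence(outputs, batch_first=True)
```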
For the definition of parameters, see https://github.com/cnt-dev/pytorch-stateful-lstm/tree/master/extension.
Credits
This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.
History
0.1.0 (2019-01-03)
First release on PyPI.
Project details
Download files
Source Distribution
Hashes for pytorch_stateful_lstm-1.5.1.tar.gz
Algorithm | Hash digest
---|---
SHA256 | a5981e1bfd49365facdd813c7735845104f128642e5792114183f56fba4e97e5
MD5 | a00ec045fd5d8114e219dbce169d8b27
BLAKE2b-256 | 8403aafe51a1584632017b0996dfd07683f9ebf80e3aed715dae5f02b6fe30f5