

# subtractive LSTM (subLSTM), for Pytorch

[![Build Status](https://travis-ci.org/ixaxaar/pytorch-sublstm.svg?branch=master)](https://travis-ci.org/ixaxaar/pytorch-sublstm) [![PyPI version](https://badge.fury.io/py/pytorch-sublstm.svg)](https://badge.fury.io/py/pytorch-sublstm)

This is an implementation of the subLSTM described in the paper [Cortical microcircuits as gated-recurrent neural networks](https://arxiv.org/abs/1711.02448) by Rui Ponte Costa et al.
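
For orientation, here is a sketch of the subLSTM recurrence from the paper (notation mine; `σ` is the logistic sigmoid, `⊙` is element-wise multiplication). The key difference from a standard LSTM is that the gates act subtractively rather than multiplicatively, and the candidate input uses a sigmoid instead of a tanh:

```latex
% subLSTM recurrence, as described in Costa et al. (2017) -- sketch, notation mine
\begin{aligned}
i_t &= \sigma(W_i x_t + R_i h_{t-1} + b_i) && \text{input gate} \\
f_t &= \sigma(W_f x_t + R_f h_{t-1} + b_f) && \text{forget gate} \\
o_t &= \sigma(W_o x_t + R_o h_{t-1} + b_o) && \text{output gate} \\
z_t &= \sigma(W_z x_t + R_z h_{t-1} + b_z) && \text{candidate input (sigmoid, not tanh)} \\
c_t &= f_t \odot c_{t-1} + z_t - i_t       && \text{subtractive input gating} \\
h_t &= \sigma(c_t) - o_t                   && \text{subtractive output gating}
\end{aligned}
```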

## Install

```bash
pip install pytorch-sublstm
```


## Usage

**Parameters**:

`SubLSTM` takes the following constructor arguments (a construction sketch follows the table):

| Argument | Default | Description |
| --- | --- | --- |
| input_size | `None` | Size of the input vectors |
| hidden_size | `None` | Size of the hidden units |
| num_layers | `1` | Number of stacked layers in the network |
| bias | `True` | Whether the layers use bias weights |
| batch_first | `False` | Whether input is fed batch-first, i.e. `(batch, seq, feature)` |
| dropout | `0` | Dropout applied between layers |
| bidirectional | `False` | Whether the network is bidirectional |
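
As a quick illustration, a constructor call that spells out the defaults from the table above (a sketch only; the sizes are arbitrary and the arguments are assumed to be accepted as keywords):

```python
from subLSTM.nn import SubLSTM

# a 2-layer subLSTM with the documented defaults written out explicitly
rnn = SubLSTM(
    input_size=10,
    hidden_size=20,
    num_layers=2,         # default: 1
    bias=True,            # default: True
    batch_first=False,    # default: False -> inputs are (seq_len, batch, input_size)
    dropout=0,            # default: 0
    bidirectional=False,  # default: False
)
```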


### Example usage:

#### nn Interface
```python
import torch
from torch.autograd import Variable
from subLSTM.nn import SubLSTM

hidden_size = 20
input_size = 10
seq_len = 5
batch_size = 7
hidden = None  # no initial hidden state is supplied

# batch-first input: (batch_size, seq_len, input_size)
input = Variable(torch.randn(batch_size, seq_len, input_size))

rnn = SubLSTM(input_size, hidden_size, num_layers=2, bias=True, batch_first=True)

# forward pass
output, hidden = rnn(input, hidden)
```
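
The returned `hidden` can presumably be passed back in on the next call, which allows a long sequence to be processed chunk by chunk. A minimal sketch, reusing `rnn` and the sizes from the snippet above (assumes the module accepts a previously returned hidden state, as in the call above):

```python
# process a longer (batch-first) sequence in fixed-size chunks,
# carrying the hidden state across chunks
long_input = Variable(torch.randn(batch_size, 4 * seq_len, input_size))

hidden = None
outputs = []
for chunk in torch.split(long_input, seq_len, dim=1):  # split along the time axis
    out, hidden = rnn(chunk, hidden)
    outputs.append(out)

full_output = torch.cat(outputs, dim=1)
```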

#### Cell Interface

```python
import torch
from torch.autograd import Variable
from subLSTM.nn import SubLSTMCell

hidden_size = 20
input_size = 10
seq_len = 5
batch_size = 7
hidden = None

# initial hidden and cell states: (batch_size, hidden_size)
hx = Variable(torch.randn(batch_size, hidden_size))
cx = Variable(torch.randn(batch_size, hidden_size))

# a single time step of input: (batch_size, input_size)
input = Variable(torch.randn(batch_size, input_size))

cell = SubLSTMCell(input_size, hidden_size, bias=True)

# one step of the cell: returns the updated hidden and cell states
(hx, cx) = cell(input, (hx, cx))
```
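
Stepping the cell over a sequence by hand then looks roughly like this (a sketch reusing `cell`, `hx`, and `cx` from the snippet above):

```python
# run the cell one time step at a time over a (seq_len, batch_size, input_size) sequence
inputs = Variable(torch.randn(seq_len, batch_size, input_size))

hiddens = []
for t in range(seq_len):
    hx, cx = cell(inputs[t], (hx, cx))  # one step on a (batch_size, input_size) slice
    hiddens.append(hx)

hiddens = torch.stack(hiddens)  # (seq_len, batch_size, hidden_size)
```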

### Tasks:

A language modeling task is included [here](./tasks/word_language_model/).
Refer to its [README](./tasks/word_language_model/README.md) for more info.


### Attributions:

A lot of the code is recycled from [PyTorch](https://pytorch.org).




## Download files


Files for pytorch-sublstm, version 0.0.2:

| Filename | Size | File type | Python version |
| --- | --- | --- | --- |
| pytorch_sublstm-0.0.2-py3-none-any.whl | 8.2 kB | Wheel | py3 |
| pytorch-sublstm-0.0.2.tar.gz | 5.9 kB | Source | None |
