
# subtractive LSTM (subLSTM), for Pytorch

[![Build Status](https://travis-ci.org/ixaxaar/pytorch-sublstm.svg?branch=master)](https://travis-ci.org/ixaxaar/pytorch-sublstm) [![PyPI version](https://badge.fury.io/py/pytorch-sublstm.svg)](https://badge.fury.io/py/pytorch-sublstm)

This is an implementation of subLSTM described in the paper [Cortical microcircuits as gated-recurrent neural networks, Rui Ponte Costa et al.](https://arxiv.org/abs/1711.02448)
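
The subLSTM replaces the LSTM's multiplicative input and output gating with subtraction: the gates and the cell candidate are all sigmoidal, the input gate is subtracted from the cell update, and the output gate is subtracted from the squashed cell state. Roughly, as an illustrative sketch of the paper's update rule (variable names and weight layout here are assumptions, not this package's internals):

```python
import torch

# Rough sketch of a single subLSTM step following the paper's equations.
input_size, hidden_size, batch_size = 10, 20, 7

# One combined projection for the four sigmoidal quantities (i, f, z, o).
W = torch.randn(input_size + hidden_size, 4 * hidden_size) * 0.1
b = torch.zeros(4 * hidden_size)

def sublstm_step(x, h_prev, c_prev):
    # Input gate i, forget gate f, cell candidate z, output gate o -- all sigmoids.
    i, f, z, o = torch.sigmoid(torch.cat([x, h_prev], dim=1) @ W + b).chunk(4, dim=1)
    c = f * c_prev + z - i        # input gate subtracts from the cell update
    h = torch.sigmoid(c) - o      # output gate subtracts from the output
    return h, c

x = torch.randn(batch_size, input_size)
h = torch.zeros(batch_size, hidden_size)
c = torch.zeros(batch_size, hidden_size)
h, c = sublstm_step(x, h, c)
```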

## Install

```bash
pip install pytorch-sublstm
```


## Usage

**Constructor parameters**:

| Argument | Default | Description |
| --- | --- | --- |
| input_size | `None` | Size of the input vectors |
| hidden_size | `None` | Size of the hidden state |
| num_layers | `1` | Number of stacked recurrent layers |
| bias | `True` | Whether to use bias weights |
| batch_first | `False` | Whether input is fed batch-first, i.e. shaped `(batch, seq, feature)` |
| dropout | `0` | Dropout applied between layers |
| bidirectional | `False` | Whether the network is bidirectional |


### Example usage:

#### nn Interface
```python
import torch
from subLSTM.nn import SubLSTM

hidden_size = 20
input_size = 10
seq_len = 5
batch_size = 7
hidden = None

# input shaped (batch, seq_len, input_size) because batch_first=True below
input = torch.randn(batch_size, seq_len, input_size)

rnn = SubLSTM(input_size, hidden_size, num_layers=2, bias=True, batch_first=True)

# forward pass
output, hidden = rnn(input, hidden)
```
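
As with `torch.nn.LSTM`, the returned `hidden` can be passed back in on the next call to carry state across consecutive chunks of a longer sequence. A short sketch building on the example above:

```python
# Carry the state across chunks of a longer sequence
# (builds on `rnn`, `batch_size`, `seq_len`, `input_size` defined above).
hidden = None
for _ in range(3):
    chunk = torch.randn(batch_size, seq_len, input_size)
    output, hidden = rnn(chunk, hidden)
```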

#### Cell Interface

```python
import torch
from subLSTM.nn import SubLSTMCell

hidden_size = 20
input_size = 10
seq_len = 5
batch_size = 7

# initial hidden and cell states
hx = torch.randn(batch_size, hidden_size)
cx = torch.randn(batch_size, hidden_size)

# a single timestep of input
input = torch.randn(batch_size, input_size)

cell = SubLSTMCell(input_size, hidden_size, bias=True)
(hx, cx) = cell(input, (hx, cx))
```
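
The cell interface is convenient when a sequence needs to be unrolled manually, one timestep at a time. A minimal sketch reusing the names defined above:

```python
# Unroll the cell over a sequence step by step
# (reuses `cell`, `hx`, `cx`, `seq_len`, `batch_size`, `input_size` from above).
inputs = torch.randn(seq_len, batch_size, input_size)

outputs = []
for t in range(seq_len):
    hx, cx = cell(inputs[t], (hx, cx))
    outputs.append(hx)

outputs = torch.stack(outputs)  # (seq_len, batch_size, hidden_size)
```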

### Tasks:

A language modeling task is included [here](./tasks/word_language_model/).
Refer to its [README](./tasks/word_language_model/README.md) for more info.


### Attributions:

Much of the code is adapted from [PyTorch](https://pytorch.org).

