
# subtractive LSTM (subLSTM), for Pytorch

[![Build Status](https://travis-ci.org/ixaxaar/pytorch-sublstm.svg?branch=master)](https://travis-ci.org/ixaxaar/pytorch-sublstm) [![PyPI version](https://badge.fury.io/py/pytorch-sublstm.svg)](https://badge.fury.io/py/pytorch-sublstm)

This is an implementation of the subLSTM architecture described in the paper [Cortical microcircuits as gated-recurrent neural networks, Rui Ponte Costa et al.](https://arxiv.org/abs/1711.02448).
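
In a subLSTM, all four gates are sigmoidal and gating is *subtractive* rather than multiplicative: the cell state updates as `c_t = f ⊙ c_{t-1} + z - i` and the output is `h_t = σ(c_t) - o`. A minimal NumPy sketch of one cell step, as an illustration of the paper's equations rather than this package's internals (the weight shapes and gate order here are illustrative assumptions):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sublstm_step(x, h_prev, c_prev, W, U, b):
    """One subLSTM step. W: (4H, D), U: (4H, H), b: (4H,).
    Gate order (i, f, z, o) is an assumption for illustration."""
    H = h_prev.shape[-1]
    pre = W @ x + U @ h_prev + b
    i, f, z, o = (sigmoid(pre[k * H:(k + 1) * H]) for k in range(4))
    c = f * c_prev + z - i   # subtractive input gating (no multiplicative input gate)
    h = sigmoid(c) - o       # subtractive output gating
    return h, c

# smoke test with random weights
rng = np.random.default_rng(0)
D, H = 10, 20
x = rng.standard_normal(D)
h0, c0 = np.zeros(H), np.zeros(H)
W = rng.standard_normal((4 * H, D)) * 0.1
U = rng.standard_normal((4 * H, H)) * 0.1
b = np.zeros(4 * H)
h1, c1 = sublstm_step(x, h0, c0, W, U, b)
```

Because both `σ(c_t)` and `o` lie in `(0, 1)`, the hidden state is bounded in `(-1, 1)` without any multiplicative squashing.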

## Install

```bash
pip install pytorch-sublstm
```


## Usage

**Constructor parameters**:

| Argument | Default | Description |
| --- | --- | --- |
| input_size | `None` | Size of the input vectors |
| hidden_size | `None` | Size of hidden units |
| num_layers | `1` | Number of layers in the network |
| bias | `True` | Whether to learn bias weights |
| batch_first | `False` | Whether input is fed batch-first, i.e. `(batch, seq, feature)` |
| dropout | `0` | Dropout probability applied between layers |
| bidirectional | `False` | Whether the network is bidirectional |


### Example usage:

#### nn Interface
```python
import torch
from subLSTM.nn import SubLSTM

hidden_size = 20
input_size = 10
seq_len = 5
batch_size = 7
hidden = None

input = torch.randn(batch_size, seq_len, input_size)

rnn = SubLSTM(input_size, hidden_size, num_layers=2, bias=True, batch_first=True)

# forward pass
output, hidden = rnn(input, hidden)
```

#### Cell Interface

```python
import torch
from subLSTM.nn import SubLSTMCell

hidden_size = 20
input_size = 10
batch_size = 7

hx = torch.randn(batch_size, hidden_size)
cx = torch.randn(batch_size, hidden_size)

input = torch.randn(batch_size, input_size)

cell = SubLSTMCell(input_size, hidden_size, bias=True)
(hx, cx) = cell(input, (hx, cx))
```
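
The cell interface is what you would use to unroll a sequence manually, calling the cell once per time step and carrying `(hx, cx)` forward. The pattern can be sketched entirely in NumPy; the gate order and weight shapes below are illustrative assumptions, not the package's internals:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(1)
batch, seq_len, D, H = 7, 5, 10, 20

# random gate weights; gate order (i, f, z, o) is assumed for illustration
W = rng.standard_normal((D, 4 * H)) * 0.1
U = rng.standard_normal((H, 4 * H)) * 0.1
b = np.zeros(4 * H)

x = rng.standard_normal((batch, seq_len, D))
h = np.zeros((batch, H))
c = np.zeros((batch, H))

outputs = []
for t in range(seq_len):
    pre = x[:, t] @ W + h @ U + b
    i, f, z, o = np.split(sigmoid(pre), 4, axis=-1)
    c = f * c + z - i        # subtractive input gating
    h = sigmoid(c) - o       # subtractive output gating
    outputs.append(h)

output = np.stack(outputs, axis=1)  # (batch, seq_len, H), like batch_first=True
```

With `SubLSTMCell`, each loop iteration would instead be `hx, cx = cell(input[:, t], (hx, cx))`, accumulating the `hx` outputs the same way.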

### Tasks:

A language modeling task is included [here](./tasks/word_language_model/).
Refer to its [README](./tasks/word_language_model/README.md) for more info.


### Attributions:

Much of the code is adapted from [PyTorch](https://pytorch.org).

