
Unofficial TensorFlow implementation of the Linear Recurrent Unit (LRU) proposed by Google DeepMind.


Linear Recurrent Units in TensorFlow: An Unofficial Implementation

This repository provides an unofficial TensorFlow implementation of Linear Recurrent Units (LRUs), proposed by Google DeepMind. LRUs draw inspiration from deep state-space models (SSMs), in particular the S4 and S5 architectures.
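For orientation: the core of the LRU described in the paper is a diagonal, complex-valued linear recurrence x_t = diag(lambda) x_{t-1} + B u_t with readout y_t = Re(C x_t) + D ⊙ u_t. The sketch below illustrates that recurrence for a single sequence in plain TensorFlow; it is not taken from this package's source, and the dimensions, initialization, and variable names are illustrative assumptions only.

import tensorflow as tf

# Illustrative sketch of the LRU recurrence (not this package's internals).
# The paper parameterizes lambda on the unit disk via an exponential map;
# here stable values are drawn directly for brevity.
N, H, T = 64, 32, 16                                   # state size, input size, sequence length
lam = tf.complex(tf.random.uniform((N,), 0.9, 0.99),   # diagonal recurrent weights, |lam| < 1
                 tf.random.uniform((N,), -0.1, 0.1))
B = tf.complex(tf.random.normal((N, H)), tf.random.normal((N, H)))  # input projection
C = tf.complex(tf.random.normal((H, N)), tf.random.normal((H, N)))  # output projection
D = tf.random.normal((H,))                             # elementwise skip connection
u = tf.random.normal((T, H))                           # one input sequence

x = tf.zeros((N,), dtype=tf.complex64)
ys = []
for t in range(T):
    u_t = tf.cast(u[t], tf.complex64)
    x = lam * x + tf.linalg.matvec(B, u_t)             # x_t = diag(lambda) x_{t-1} + B u_t
    ys.append(tf.math.real(tf.linalg.matvec(C, x)) + D * u[t])  # y_t = Re(C x_t) + D * u_t
y = tf.stack(ys)                                       # (T, H) output sequence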

Installation:

$ pip install LRU-tensorflow

Usage:

import tensorflow as tf
from LRU_tensorflow import LRU

# Example dimensions (illustrative values)
state_features, input_size = 64, 32
batch_size, seq_length = 8, 128

lru = LRU(N=state_features, H=input_size)  # N: number of state features, H: input size
test_input = tf.random.uniform((batch_size, seq_length, input_size))  # example input batch
predictions = lru(test_input)  # forward pass over the batch of sequences
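
Per the paper's formulation, the LRU produces one output vector per time step, so predictions is expected to have shape (batch_size, seq_length, input_size). If the LRU class is implemented as a standard Keras layer (an assumption about this package, not something verified here), it could in principle be composed with other layers, e.g. for sequence classification:

# Sketch only: assumes LRU subclasses tf.keras.layers.Layer and returns a
# (batch, seq_length, input_size) tensor; neither assumption is verified here.
model = tf.keras.Sequential([
    LRU(N=state_features, H=input_size),
    tf.keras.layers.GlobalAveragePooling1D(),          # pool over the time dimension
    tf.keras.layers.Dense(10, activation="softmax"),   # e.g. a 10-class classifier head
])
class_probs = model(test_input)                        # (batch_size, 10)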

Paper:

Resurrecting Recurrent Neural Networks for Long Sequences

Notes:

  • If you require an implementation that supports 3-dimensional input sequences, you may want to refer to github.com/Gothos/LRU-pytorch. However, please be aware that this alternative implementation might be slower due to the absence of associative scans (see the sketch after this note for what an associative scan buys here).
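
For context, associative scans let a linear recurrence be evaluated in parallel over the time dimension: a diagonal recurrence x_t = a_t * x_{t-1} + b_t can be expressed with the associative operator (a_i, b_i) . (a_j, b_j) = (a_i * a_j, a_j * b_i + b_j), which parallel scan primitives (e.g. tfp.math.scan_associative or jax.lax.associative_scan) evaluate in O(log T) depth instead of O(T). The sketch below is a generic illustration of that idea, not code from either repository.

import tensorflow as tf

def combine(carry, elem):
    # Associative combination of two recurrence steps:
    # (a_prev, b_prev) followed by (a_t, b_t).
    a_prev, b_prev = carry
    a_t, b_t = elem
    return a_t * a_prev, a_t * b_prev + b_t

seq_length, state_size = 8, 4
a = tf.random.uniform((seq_length, state_size), 0.5, 0.99)  # per-step decay factors
b = tf.random.uniform((seq_length, state_size))             # per-step driving inputs (e.g. B u_t)

# Sequential reference using tf.scan; a parallel associative scan would apply
# `combine` in a balanced tree over time instead of strictly left to right.
_, states = tf.scan(combine, (a, b))
# states[t] equals x_t of the recurrence started from a zero initial state.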

