
MinGRU Implementation in Keras

This repository contains a Keras implementation of the MinGRU model, a minimal and parallelizable version of the traditional Gated Recurrent Unit (GRU) architecture. The MinGRU model is based on the research paper "Were RNNs All We Needed?", which revisits traditional recurrent neural networks and modifies them so they can be trained efficiently in parallel.
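As a rough sketch of the underlying idea (following the paper, not necessarily this package's internal code), the minGRU removes the reset gate and makes the update gate depend only on the current input, so a single recurrence step looks roughly like the snippet below; the weight names Wz and Wh are hypothetical and biases are omitted:

from keras import ops

def min_gru_step(x_t, h_prev, Wz, Wh):
    # Update gate: depends only on the current input x_t, not on h_prev.
    z_t = ops.sigmoid(ops.matmul(x_t, Wz))
    # Candidate hidden state, also computed from the input alone.
    h_tilde = ops.matmul(x_t, Wh)
    # New state: convex combination of the previous state and the candidate.
    return (1.0 - z_t) * h_prev + z_t * h_tilde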

Features

  • Minimal GRU architecture with significantly fewer parameters than traditional GRUs
  • Fully parallelizable across time steps during training, which makes training faster than with a sequential GRU (see the sketch after this list)
  • Compatible with Keras 3
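
Because the gate and the candidate state do not depend on the previous hidden state, the recurrence h_t = (1 - z_t) * h_{t-1} + z_t * h_tilde_t is linear in h, which is what makes parallel training possible: all hidden states can be obtained with a scan over the sequence instead of a step-by-step loop. The snippet below only illustrates that idea in NumPy with a naive closed form; it is not this package's implementation, which would need the numerically stable (e.g. log-space) formulation described in the paper:

import numpy as np

def linear_recurrence(z, h_tilde, h0):
    # Solves h_t = a_t * h_{t-1} + b_t for all t at once,
    # with a_t = 1 - z_t and b_t = z_t * h_tilde_t.
    a = 1.0 - z                      # shape (time, units)
    b = z * h_tilde                  # shape (time, units)
    a_cum = np.cumprod(a, axis=0)    # prefix products of the decay terms
    # h_t = a_cum_t * h0 + a_cum_t * sum over k <= t of b_k / a_cum_k
    return a_cum * (h0 + np.cumsum(b / a_cum, axis=0))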

Dependencies

This project uses uv to manage dependencies. To install the required dependencies, run:

uv sync

Usage

To use the MinGRU model in your own project, simply import the MinGRU class and use it as you would any other Keras layer.

Example

import keras
from mingru_keras import MinGRU

# Create a MinGRU layer with 64 hidden units.
layer = MinGRU(units=64)

# Batch of 32 sequences, each 1000 time steps long with 8 features per step.
b, t, d = 32, 1000, 8
X = keras.random.normal((b, t, d))
Y = layer(X)
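
As a further, hypothetical sketch, the layer can be composed with other Keras layers in a model. The example below assumes that MinGRU returns the full sequence of hidden states with shape (batch, time, units); check Y.shape from the example above before relying on that assumption:

import keras
from keras import layers
from mingru_keras import MinGRU

# Toy sequence-classification model; the pooling step assumes the MinGRU
# layer emits one hidden state per time step.
inputs = keras.Input(shape=(1000, 8))
x = MinGRU(units=64)(inputs)
x = layers.GlobalAveragePooling1D()(x)
outputs = layers.Dense(10, activation="softmax")(x)
model = keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")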

Contributing

Contributions are welcome! If you'd like to report a bug or suggest a feature, please open an issue or submit a pull request.

