This package contains a Keras 3 implementation of the MinGRU layer, a minimal and parallelizable version of the gated recurrent unit (GRU).
MinGRU Implementation in Keras
This repository contains a Keras implementation of the MinGRU model, a minimal and parallelizable version of the traditional Gated Recurrent Unit (GRU) architecture. The MinGRU model is based on the research paper "Were RNNs All We Needed?" that revisits traditional recurrent neural networks and modifies them to be efficiently trained in parallel.
Features
- Minimal GRU architecture with significantly fewer parameters than traditional GRUs
- Fully parallelizable during training, achieving faster training times
- Compatible with Keras 3
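The "fewer parameters" and "parallelizable" claims both come from the simplification in the paper: the gate and candidate state depend only on the current input, not on the previous hidden state. A plain NumPy sketch of the recurrence (the weight names `W_z`, `W_h` and the omission of biases are illustrative assumptions, not this package's exact internals):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def min_gru_reference(X, W_z, W_h):
    """Sequential reference of the minGRU recurrence from the paper.

    z_t   = sigmoid(W_z x_t)   # gate depends only on the input
    h~_t  = W_h x_t            # candidate state: no tanh, no h_{t-1} term
    h_t   = (1 - z_t) * h_{t-1} + z_t * h~_t
    """
    T, _ = X.shape
    units = W_z.shape[1]
    h = np.zeros(units)
    out = np.empty((T, units))
    for t in range(T):
        z = sigmoid(X[t] @ W_z)            # update gate
        h_tilde = X[t] @ W_h               # candidate hidden state
        h = (1.0 - z) * h + z * h_tilde    # convex combination of old and new
        out[t] = h
    return out

rng = np.random.default_rng(0)
T, d, units = 10, 8, 64
X = rng.normal(size=(T, d))
W_z = rng.normal(size=(d, units))
W_h = rng.normal(size=(d, units))
H = min_gru_reference(X, W_z, W_h)
print(H.shape)  # (10, 64)
```

Note there are only two weight matrices, both input-to-hidden; a standard GRU additionally carries three hidden-to-hidden matrices, which is where the parameter savings come from.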
Dependencies
This project uses uv to manage dependencies. To install them, run:
uv sync
Usage
To use the MinGRU layer in your own project, simply import the MinGRU class and use it as you would any other Keras layer.
Example
import keras
from mingru_keras import MinGRU

layer = MinGRU(units=64)

b, t, d = 32, 1000, 8               # batch size, sequence length, feature dimension
X = keras.random.normal((b, t, d))  # random input batch of shape (batch, time, features)
Y = layer(X)
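Because the gate and candidate depend only on the inputs, every step's z_t and h~_t can be computed at once, and the remaining recurrence h_t = (1 - z_t) * h_{t-1} + z_t * h~_t reduces to a prefix scan; this is what makes training parallelizable. A minimal NumPy sketch of the idea (the paper uses a numerically stabler log-space scan; the closed form below is illustrative only):

```python
import numpy as np

rng = np.random.default_rng(1)
T, d, units = 16, 8, 32
X = rng.normal(size=(T, d))
W_z = rng.normal(size=(d, units)) * 0.1  # small illustrative weights
W_h = rng.normal(size=(d, units)) * 0.1

z = 1.0 / (1.0 + np.exp(-(X @ W_z)))  # all gates computed in parallel
h_tilde = X @ W_h                     # all candidate states in parallel

# With h_0 = 0, the recurrence h_t = (1 - z_t) * h_{t-1} + z_t * h~_t
# has the closed form  h_t = A_t * sum_{k<=t} (z_k * h~_k) / A_k,
# where A_t = prod_{j<=t} (1 - z_j) is a prefix product.
A = np.cumprod(1.0 - z, axis=0)
H_parallel = A * np.cumsum(z * h_tilde / A, axis=0)

# Sequential loop for comparison.
h = np.zeros(units)
H_loop = np.empty((T, units))
for t in range(T):
    h = (1.0 - z[t]) * h + z[t] * h_tilde[t]
    H_loop[t] = h

print(np.allclose(H_parallel, H_loop))  # True
```

The cumprod/cumsum pair stands in for the parallel scan primitive; on an accelerator, both run in O(log T) depth instead of the O(T) steps of the loop.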
Contributing
Contributions are welcome! If you'd like to report a bug or suggest a feature, please open an issue or submit a pull request.
Hashes for mingru_keras-0.1.0-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 7a9ab300b1f77d3a7fc79c6161df896b695c8a71b333e6f864dad9e9fd492ec5
MD5 | 20855dba44cc72c16bc6e625d9afb5ec
BLAKE2b-256 | 7c03b087e66d6064ef75ef4c49f24dadb71badc6d969be502c1a7657368d46cd