
A PyTorch implementation of Cerebral LSTM: A Better Alternative for Single- and Multi-Stacked LSTM Cell-Based RNNs.

Project description

Cerebral LSTM - Implementation in PyTorch

This repository provides a Python package with a PyTorch implementation of Cerebral LSTM, as presented in the paper "Cerebral LSTM: A Better Alternative for Single- and Multi-Stacked LSTM Cell-Based RNNs", published in the Springer Nature journal SN Computer Science.

Paper Title: Cerebral LSTM: A Better Alternative for Single- and Multi-Stacked LSTM Cell-Based RNNs

Author: Ravin Kumar

Published: 14 March 2020

DOI: https://doi.org/10.1007/s42979-020-0101-1

Other Sources:

GitHub Repositories:

  • GitHub repository (Python package, PyTorch implementation): https://github.com/mr-ravin/cerebral_lstm
  • GitHub repository (Sentiment analysis: LSTM vs Cerebral LSTM): ML Experiments

Cite Paper as:

Kumar, R. Cerebral LSTM: A Better Alternative for Single- and Multi-Stacked LSTM Cell-Based RNNs. SN COMPUT. SCI. 1, 85 (2020). https://doi.org/10.1007/s42979-020-0101-1

Cerebral LSTM Architecture:

Upper cell:

Uf(t) = σ(Wuf ⋅ [h(t−1), x(t)] + buf)
Ui(t) = σ(Wui ⋅ [h(t−1), x(t)] + bui)
UCtmp(t) = tanh(Wuc ⋅ [h(t−1), x(t)] + buc)
UC(t) = Uf(t) ∗ UC(t−1) + Ui(t) ∗ UCtmp(t)
Uo(t) = σ(Wuo ⋅ [h(t−1), x(t)] + buo)

Lower cell:

Lf(t) = σ(Wlf ⋅ [h(t−1), x(t)] + blf)
Li(t) = σ(Wli ⋅ [h(t−1), x(t)] + bli)
LCtmp(t) = tanh(Wlc ⋅ [h(t−1), x(t)] + blc)
LC(t) = Lf(t) ∗ LC(t−1) + Li(t) ∗ LCtmp(t)
Lo(t) = σ(Wlo ⋅ [h(t−1), x(t)] + blo)

Hidden state:

h(t) = Uo(t) ∗ tanh(UC(t)) + Lo(t) ∗ tanh(LC(t))

Here σ is the logistic sigmoid, each W ⋅ [h(t−1), x(t)] + b term is a linear map over the concatenated previous hidden state and current input, and ∗ denotes element-wise multiplication.
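The equations above can be sketched directly in PyTorch. The following is a minimal illustrative re-implementation of one time step of the cell; the class and variable names are my own, not the package's API (fusing the four gates of each part into one linear layer is an implementation convenience, equivalent to the per-gate maps above):

```python
import torch
import torch.nn as nn

class CerebralCellSketch(nn.Module):
    """Illustrative sketch of one Cerebral LSTM time step (not the package's code)."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        # One fused linear map per part, acting on [h(t-1), x(t)];
        # each produces the four gate pre-activations (f, i, c_tmp, o).
        self.upper = nn.Linear(input_size + hidden_size, 4 * hidden_size)
        self.lower = nn.Linear(input_size + hidden_size, 4 * hidden_size)

    def forward(self, x, h, uc, lc):
        z = torch.cat([h, x], dim=-1)
        uf, ui, ug, uo = self.upper(z).chunk(4, dim=-1)
        lf, li, lg, lo = self.lower(z).chunk(4, dim=-1)
        # UC(t) = Uf(t) * UC(t-1) + Ui(t) * UCtmp(t), and likewise for LC(t)
        uc = torch.sigmoid(uf) * uc + torch.sigmoid(ui) * torch.tanh(ug)
        lc = torch.sigmoid(lf) * lc + torch.sigmoid(li) * torch.tanh(lg)
        # h(t) = Uo(t) * tanh(UC(t)) + Lo(t) * tanh(LC(t))
        h = torch.sigmoid(uo) * torch.tanh(uc) + torch.sigmoid(lo) * torch.tanh(lc)
        return h, uc, lc
```

Note that the two parts share the same inputs but keep independent cell states; only the hidden state h(t) merges them.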

Python Package: PyTorch Implementation

📥 Installation
pip install cerebral_lstm

Or, install directly from GitHub:

pip install git+https://github.com/mr-ravin/cerebral_lstm.git
🚀 Usage
import torch
from cerebral_lstm import CerebralLSTM

# Create a Cerebral LSTM model, i.e. an RNN model built from Cerebral LSTM cell units
model = CerebralLSTM(input_size=64, hidden_size=128, num_layers=2, use_xavier=True, dropout=0.5) # Default: use_xavier=True

# Input: (seq_len, batch_size, input_size)
x = torch.randn(10, 32, 64)  # Example input
output, hidden = model(x)
print(output.shape)  # (10, 32, 128)
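The (seq_len, batch_size, input_size) layout above matches PyTorch's built-in nn.LSTM with batch_first=False (the default), so pipelines written for nn.LSTM should be able to swap in CerebralLSTM with the same shapes. For comparison, the equivalent built-in LSTM call (this snippet uses only torch, not the package):

```python
import torch
import torch.nn as nn

# Same layer sizes and input layout as the CerebralLSTM example above.
lstm = nn.LSTM(input_size=64, hidden_size=128, num_layers=2)
x = torch.randn(10, 32, 64)          # (seq_len, batch_size, input_size)
out, (h, c) = lstm(x)
print(out.shape)                     # (10, 32, 128) - same as CerebralLSTM's output
```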
How to access only the Cerebral LSTM cell unit?
from cerebral_lstm import CerebralLSTMCell

# Get only a single Cerebral LSTM cell unit
lstm_cell_unit = CerebralLSTMCell(input_size=64, hidden_size=128, use_xavier=True) # Default: use_xavier=True

Impact of Initialisation of Trainable Parameters in Cerebral LSTM

The initial values of the trainable parameters of the upper and lower parts affect the number of epochs required to train a Cerebral LSTM cell. Ideally, the upper and lower parts should not share the same initial values for their trainable parameters.

Identical initial trainable parameter values for upper and lower parts ❌

Initial Symmetry: The upper and lower parts of the Cerebral LSTM process inputs identically, leading to similar cell states UC(t) and LC(t).

Redundancy: Initial representations of upper and lower parts are redundant, potentially under-utilizing the model’s capacity.

Gradients: Early training updates are similar, but divergence may occur over time, leading to different feature extraction.

Different initial trainable parameter values for upper and lower parts ✔️

Diverse Learning: Upper and lower parts of Cerebral LSTM immediately capture different aspects of the data, enhancing representation diversity.

Specialization: Faster convergence and better utilization of the dual-path architecture, as each path can specialize in different features.

Performance: Improved performance due to richer, non-redundant representations from the start.
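The symmetry argument can be checked with a small, self-contained PyTorch experiment: two identically initialized paths whose outputs are summed receive identical gradients, so gradient descent alone never separates them (the layer sizes and the toy two-path forward pass here are illustrative, not the package's internals):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
upper = nn.Linear(8, 8)
lower = nn.Linear(8, 8)
# Force the two paths to start from identical parameters.
lower.load_state_dict(upper.state_dict())

x = torch.randn(4, 8)
# Toy two-path forward pass; tanh makes the gradient depend on the weights.
loss = (torch.tanh(upper(x)) + torch.tanh(lower(x))).sum()
loss.backward()

# Identical parameters receive identical gradients, so a gradient step
# leaves the paths identical: they stay redundant until distinct
# initialization (or noise such as dropout) breaks the symmetry.
assert torch.allclose(upper.weight.grad, lower.weight.grad)
assert torch.allclose(upper.bias.grad, lower.bias.grad)
```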


Experimentation repository: https://github.com/mr-ravin/cerebral-rnn-experimental-results

  • Comparative Study - Cerebral LSTM vs LSTM:

The PyTorch implementation of Cerebral LSTM is available in the Cerebral_LSTM/Cerebral_LSTM_Implementation_in_Pytorch.ipynb file.

  • Comparative Study: Cerebral LSTM vs Stacked-LSTM vs LSTM (logs only)

    For the training loss graphs presented in the research paper, the repository follows the structure below:

    |
    |-data/                             # This directory contains dataset used for comparison.
    |
    |-loss_values/                      # This directory contains record of training loss for each model to perform comparative analysis.
          |
          |- 2stack_lstm.txt 
          |- proposed_model.txt
          |- single_lstm.txt
    

Conclusion

Our proposed recurrent cell "Cerebral LSTM" showed the ability to better understand data and easily outperformed both single-LSTM and two-stacked-LSTM-based recurrent neural networks. Many variants of Cerebral LSTM can be designed using available varieties of LSTM cells, such as peephole LSTM. Further research can be conducted on designing Cerebral LSTM-based stacked recurrent neural networks for deep learning architectures that understand time-series data. Other recurrent cells, including gated recurrent units, can also be analyzed after modifying their internal connections in a manner similar to our cerebral structure.


Copyright License

Copyright (c) 2025 Ravin Kumar
Website: https://mr-ravin.github.io

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation 
files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, 
modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the 
Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE 
WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR 
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, 
ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


