
PyTorch implementation of Recurrent Memory Array Structures

Project description

ArrayLSTM

This code was implemented as part of the IEEE S&P DeepCASE [1] paper. We provide a PyTorch implementation of Recurrent Memory Array Structures by Kamil M. Rocki [2]. We ask that you cite both works when using this software for academic research.
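As a quick orientation, the sketch below shows how such a module would typically be used. The import path (array_lstm, inferred from the wheel name) and the ArrayLSTM constructor and call signature are assumptions for illustration only; consult the documentation at arraylstm.readthedocs.io for the actual API.

import torch
from array_lstm import ArrayLSTM  # assumed import path, inferred from the wheel name

# Assumed LSTM-like constructor: input size, hidden size, memory cells per hidden unit
lstm = ArrayLSTM(input_size=128, hidden_size=256, k=4)

x = torch.randn(32, 10, 128)           # (batch, sequence, input_size), toy data
output, (hidden, state) = lstm(x)      # assumed nn.LSTM-style return signature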

Introduction

Rocki's report [2] introduces ideas for augmenting the standard Long Short-Term Memory (LSTM) architecture with multiple memory cells per hidden unit in order to improve its generalization capabilities. It considers both deterministic and stochastic variants of memory operation. The non-deterministic Array-LSTM approach is shown to improve state-of-the-art performance on character-level text prediction, achieving 1.402 bits per character (BPC) on the enwik8 dataset. Furthermore, the report establishes baseline neural-based results of 1.12 BPC and 1.19 BPC for the enwik9 and enwik10 datasets, respectively.
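To make the core idea concrete, the following is a minimal sketch (not the code shipped in this package) of an Array-LSTM-style cell: every hidden unit keeps k memory cells instead of one, each memory "lane" has its own gates, and the lane outputs are pooled into a single hidden state. The stochastic variants described in the paper instead randomize which lane is updated or read at each step; that detail is omitted here.

import torch
import torch.nn as nn

class ArrayLSTMCellSketch(nn.Module):
    # Illustrative Array-LSTM-style cell with k memory lanes per hidden unit.
    def __init__(self, input_size, hidden_size, k=4):
        super().__init__()
        self.hidden_size, self.k = hidden_size, k
        # One input/forget/output/candidate gate set per memory lane.
        self.gates = nn.Linear(input_size + hidden_size, 4 * k * hidden_size)

    def forward(self, x, hidden, cells):
        # x: (batch, input_size), hidden: (batch, hidden_size),
        # cells: (batch, k, hidden_size) -- one memory cell per lane and hidden unit.
        z = self.gates(torch.cat([x, hidden], dim=-1))
        z = z.view(x.size(0), self.k, 4, self.hidden_size)
        i, f = torch.sigmoid(z[:, :, 0]), torch.sigmoid(z[:, :, 1])
        o, g = torch.sigmoid(z[:, :, 2]), torch.tanh(z[:, :, 3])
        cells = f * cells + i * g                    # update every lane (deterministic variant)
        hidden = (o * torch.tanh(cells)).sum(dim=1)  # pool lane outputs into one hidden state
        return hidden, cells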

Documentation

We provide extensive documentation, including installation instructions and an API reference, at arraylstm.readthedocs.io

References

[1] van Ede, T., Aghakhani, H., Spahn, N., Bortolameotti, R., Cova, M., Continella, A., van Steen, M., Peter, A., Kruegel, C., & Vigna, G. (2022, May). DeepCASE: Semi-Supervised Contextual Analysis of Security Events. In Proceedings of the 2022 IEEE Symposium on Security and Privacy (S&P). IEEE.

[2] Rocki, K.M. (2016). Recurrent Memory Array Structures. arXiv preprint arXiv:1607.03085.

Bibtex

@inproceedings{vanede2020deepcase,
  title={{DeepCASE: Semi-Supervised Contextual Analysis of Security Events}},
  author={van Ede, Thijs and Aghakhani, Hojjat and Spahn, Noah and Bortolameotti, Riccardo and Cova, Marco and Continella, Andrea and van Steen, Maarten and Peter, Andreas and Kruegel, Christopher and Vigna, Giovanni},
  booktitle={Proceedings of the IEEE Symposium on Security and Privacy (S&P)},
  year={2022},
  organization={IEEE}
}
@article{rocki2016recurrent,
  title={Recurrent memory array structures},
  author={Rocki, Kamil},
  journal={arXiv preprint arXiv:1607.03085},
  year={2016}
}

Download files

Download the file for your platform.
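Alternatively, the package can be installed directly from PyPI; based on the distribution names below, the install command would be pip install array-lstm.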

Source Distribution

array-lstm-0.0.1.tar.gz (64.7 kB)

Uploaded Source

Built Distribution

array_lstm-0.0.1-py3-none-any.whl (9.1 kB)

Uploaded Python 3

File details

Details for the file array-lstm-0.0.1.tar.gz.

File metadata

  • Download URL: array-lstm-0.0.1.tar.gz
  • Upload date:
  • Size: 64.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.7.3 pkginfo/1.7.1 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.56.0 CPython/3.8.10

File hashes

Hashes for array-lstm-0.0.1.tar.gz:

  • SHA256: 60a841c39833910688ecb8fe016b2480dd26d89cf973ebc19d32081f951a7937
  • MD5: 23e8306a28bd412eb3cb20da5c50ac01
  • BLAKE2b-256: ed85dff429f33f34c041100e149f0ccdbcdfcecf96d554f1ceeecbb896e4c3dd

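To verify a downloaded sdist against the SHA256 digest above, a short standard-library snippet suffices; the file name assumes the default download name.

import hashlib

expected = "60a841c39833910688ecb8fe016b2480dd26d89cf973ebc19d32081f951a7937"
with open("array-lstm-0.0.1.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print("OK" if digest == expected else "hash mismatch")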

File details

Details for the file array_lstm-0.0.1-py3-none-any.whl.

File metadata

  • Download URL: array_lstm-0.0.1-py3-none-any.whl
  • Upload date:
  • Size: 9.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.7.3 pkginfo/1.7.1 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.56.0 CPython/3.8.10

File hashes

Hashes for array_lstm-0.0.1-py3-none-any.whl:

  • SHA256: 191b4d0b14f605c1c02b0c62f8da63799450917a4320297213710de4926ed905
  • MD5: 5c2a03493b15d2eb7dab9a9f1be2fa44
  • BLAKE2b-256: ff11c0156bb183a76223195fb096a37a61d6492e50eb17667dbaf368b80f2e88
