A collection of PyTorch implementations of neural network architectures and layers.
LabML Neural Networks
This is a collection of simple PyTorch implementations of various neural network architectures and layers. We will keep adding to it.
Modules
✨ Transformers
The Transformers module contains implementations of multi-headed attention and relative multi-headed attention.
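A minimal usage sketch of the multi-headed attention layer is shown below. The import path `labml_nn.transformers.mha`, the `MultiHeadAttention(heads, d_model)` constructor, and the `[seq_len, batch_size, d_model]` tensor layout are assumptions based on the annotated source; verify them against the documentation.

```python
import torch
# Assumed import path and constructor signature; verify against the labml_nn source/docs.
from labml_nn.transformers.mha import MultiHeadAttention

seq_len, batch_size, d_model, heads = 10, 2, 512, 8
x = torch.randn(seq_len, batch_size, d_model)  # assumed layout: [seq_len, batch_size, d_model]

mha = MultiHeadAttention(heads=heads, d_model=d_model)
# Self-attention: query, key and value are the same tensor
out = mha(query=x, key=x, value=x)
print(out.shape)  # expected to match the input shape
```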
✨ Recurrent Highway Networks
✨ LSTM
✨ Capsule Networks
✨ Generative Adversarial Networks
✨ Sketch RNN
✨ Reinforcement Learning
- Proximal Policy Optimization with Generalized Advantage Estimation (see the GAE sketch after this list)
- Deep Q Networks with Dueling Network, Prioritized Replay and Double Q Network
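Generalized Advantage Estimation itself is simple to state: with TD residuals δ_t = r_t + γV(s_{t+1}) − V(s_t), the advantage is the discounted sum Â_t = Σ_l (γλ)^l δ_{t+l}, which can be computed backwards as Â_t = δ_t + γλ Â_{t+1}. The following is a generic PyTorch sketch of that recursion for a single uninterrupted rollout; it is illustrative only and is not the library's implementation.

```python
import torch

def gae(rewards, values, last_value, gamma=0.99, lam=0.95):
    """Generalized Advantage Estimation over one rollout (illustrative sketch).

    rewards:    [T] rewards r_t
    values:     [T] value estimates V(s_t)
    last_value: V(s_T) used to bootstrap the final step
    """
    T = rewards.shape[0]
    advantages = torch.zeros(T)
    next_value = last_value
    running = 0.0
    for t in reversed(range(T)):
        delta = rewards[t] + gamma * next_value - values[t]  # TD residual delta_t
        running = delta + gamma * lam * running              # A_t = delta_t + (gamma * lam) * A_{t+1}
        advantages[t] = running
        next_value = values[t]
    return advantages

# Example with dummy data
rewards = torch.tensor([1.0, 0.0, 1.0])
values = torch.tensor([0.5, 0.4, 0.6])
print(gae(rewards, values, last_value=0.0))
```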
Installation
pip install labml_nn
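After installing, a quick sanity check is to import the package (a minimal check, not part of the project's documentation):

```python
# Confirm the package is importable and see where it was installed
import labml_nn
print(labml_nn.__file__)
```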
Citing LabML
If you use LabML for academic research, please cite the library using the following BibTeX entry.
@misc{labml,
  author = {Varuna Jayasiri and Nipun Wijerathne},
  title = {LabML: A library to organize machine learning experiments},
  year = {2020},
  url = {https://lab-ml.com/},
}