A collection of PyTorch implementations of neural network architectures and layers.
Project description
labml.ai Neural Networks
This is a collection of simple PyTorch implementations of neural networks and related algorithms. These implementations are documented with explanations, and the website renders them as side-by-side formatted notes. We believe these notes will help you understand the algorithms better.
We are actively maintaining this repo and adding new implementations almost weekly.
Modules
✨ Transformers
- Multi-headed attention
- Transformer building blocks
- Transformer XL
- Compressive Transformer
- GPT Architecture
- GLU Variants
- kNN-LM: Generalization through Memorization
- Feedback Transformer
- Switch Transformer
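To make the building blocks above concrete, here is a minimal sketch of multi-headed scaled dot-product attention in plain PyTorch. This is an illustrative simplification, not the repository's implementation (which also covers masking, dropout, and key/value inputs that differ from the query).

```python
import torch
import torch.nn as nn


class MultiHeadAttention(nn.Module):
    """Scaled dot-product attention split across several heads."""

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        # One linear projection each for queries, keys, values, plus the output.
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) — self-attention over the sequence.
        b, t, _ = x.shape

        def split(proj: nn.Linear) -> torch.Tensor:
            # Project, then reshape to (batch, heads, seq_len, d_head).
            return proj(x).view(b, t, self.n_heads, self.d_head).transpose(1, 2)

        q, k, v = split(self.q_proj), split(self.k_proj), split(self.v_proj)
        # Attention weights: softmax(Q K^T / sqrt(d_head)) per head.
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5
        attn = scores.softmax(dim=-1)
        # Weighted sum of values, heads merged back to (batch, seq_len, d_model).
        out = (attn @ v).transpose(1, 2).reshape(b, t, -1)
        return self.out_proj(out)
```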
✨ Recurrent Highway Networks
✨ LSTM
✨ HyperNetworks - HyperLSTM
✨ Capsule Networks
✨ Generative Adversarial Networks
✨ Sketch RNN
✨ Reinforcement Learning
- Proximal Policy Optimization with Generalized Advantage Estimation
- Deep Q Networks with Dueling Network, Prioritized Replay and Double Q Network
✨ Optimizers
✨ Normalization Layers
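As a taste of what the normalization-layer notes cover, here is a sketch of layer normalization written from its definition: each sample is normalized over its feature dimension to zero mean and unit variance. This omits the learnable gain and bias parameters that a full implementation would include.

```python
import torch


def layer_norm(x: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    # Normalize over the last (feature) dimension of each sample.
    mean = x.mean(dim=-1, keepdim=True)
    var = x.var(dim=-1, keepdim=True, unbiased=False)
    # eps guards against division by zero for near-constant features.
    return (x - mean) / torch.sqrt(var + eps)
```

Unlike batch normalization, this requires no running statistics, so it behaves identically at training and inference time.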
Installation
pip install labml-nn
Citing LabML
If you use LabML for academic research, please cite the library using the following BibTeX entry.
@misc{labml,
author = {Varuna Jayasiri and Nipun Wijerathne},
title = {LabML: A library to organize machine learning experiments},
year = {2020},
url = {https://nn.labml.ai/},
}