🧑🏫 Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, XL, switch, feedback, ViT), optimizers (Adam, RAdam, AdaBelief), GANs (DCGAN, CycleGAN, StyleGAN2), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, etc. 🧠
labml.ai Deep Learning Paper Implementations
This is a collection of simple PyTorch implementations of neural networks and related algorithms, documented with explanations. The website renders these as side-by-side formatted notes, which we believe help you understand the algorithms better.
We are actively maintaining this repo and adding new implementations almost weekly.
Modules
✨ Transformers
- Multi-headed attention (a minimal sketch follows this list)
- Transformer building blocks
- Transformer XL
- Compressive Transformer
- GPT Architecture
- GLU Variants
- kNN-LM: Generalization through Memorization
- Feedback Transformer
- Switch Transformer
- Fast Weights Transformer
- FNet
- Attention Free Transformer
- Masked Language Model
- MLP-Mixer: An all-MLP Architecture for Vision
- Pay Attention to MLPs (gMLP)
- Vision Transformer (ViT)
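To give a flavor of what the notes cover, here is a minimal sketch of multi-headed attention: scaled dot-product attention computed over several heads in parallel. This is a simplified stand-in for the annotated implementation, which also supports attention masks and separate query/key/value inputs:

```python
import torch
import torch.nn as nn

class MultiHeadAttention(nn.Module):
    """Scaled dot-product attention split across several heads."""
    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads, self.d_head = n_heads, d_model // n_heads
        # One linear projection each for queries, keys, values, and the output
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, _ = x.shape  # (batch, seq_len, d_model)
        # Project, then split the model dimension into heads: (b, heads, t, d_head)
        def split(h):
            return h.view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        q, k, v = split(self.q_proj(x)), split(self.k_proj(x)), split(self.v_proj(x))
        # softmax(Q Kᵀ / sqrt(d_head)) V, computed per head
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5
        out = scores.softmax(dim=-1) @ v
        # Merge the heads back and apply the output projection
        return self.out_proj(out.transpose(1, 2).reshape(b, t, -1))
```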
✨ Recurrent Highway Networks
✨ LSTM
✨ HyperNetworks - HyperLSTM
✨ ResNet
✨ Capsule Networks
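The key non-linearity in capsule networks is the squash function: it preserves a capsule vector's direction while shrinking its length into [0, 1), so the length can be read as the probability that the entity the capsule represents is present. A minimal sketch:

```python
import torch

def squash(s: torch.Tensor, dim: int = -1, eps: float = 1e-8) -> torch.Tensor:
    """Shrink a capsule vector's length into [0, 1) without changing its direction."""
    sq_norm = (s ** 2).sum(dim=dim, keepdim=True)
    scale = sq_norm / (1.0 + sq_norm)  # long vectors map to length near 1
    return scale * s / torch.sqrt(sq_norm + eps)
```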
✨ Generative Adversarial Networks
- Original GAN
- GAN with deep convolutional network
- Cycle GAN
- Wasserstein GAN
- Wasserstein GAN with Gradient Penalty
- StyleGAN 2
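As a sketch of the shared skeleton behind these, here is the original GAN training step, with hypothetical two-layer MLPs standing in for the generator and discriminator. The discriminator learns to tell real samples from generated ones; the generator is trained with the non-saturating loss to fool it:

```python
import torch
import torch.nn as nn

latent_dim, img_dim = 100, 28 * 28  # hypothetical sizes for flattened MNIST-like images
G = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(), nn.Linear(256, img_dim), nn.Tanh())
D = nn.Sequential(nn.Linear(img_dim, 256), nn.LeakyReLU(0.2), nn.Linear(256, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real: torch.Tensor):
    b = real.size(0)
    fake = G(torch.randn(b, latent_dim))
    # Discriminator: push real towards 1, fake towards 0 (detach so G gets no gradient here)
    d_loss = bce(D(real), torch.ones(b, 1)) + bce(D(fake.detach()), torch.zeros(b, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()
    # Generator: non-saturating loss, make the updated D label fakes as real
    g_loss = bce(D(fake), torch.ones(b, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()
```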
✨ Sketch RNN
✨ Graph Neural Networks
✨ Counterfactual Regret Minimization (CFR)
Solving games with incomplete information such as poker with CFR.
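The core update inside CFR is regret matching: at each information set, the next strategy plays every action in proportion to its accumulated positive counterfactual regret, and uniformly when no action has positive regret. A minimal sketch with illustrative numbers:

```python
import numpy as np

def regret_matching(cum_regrets: np.ndarray) -> np.ndarray:
    """Map cumulative counterfactual regrets at one information set to a strategy."""
    positive = np.maximum(cum_regrets, 0.0)
    total = positive.sum()
    if total > 0:
        return positive / total
    # No action has positive regret: play uniformly at random
    return np.full(len(cum_regrets), 1.0 / len(cum_regrets))

# Illustrative regrets for three actions (e.g. fold/call/raise in a toy poker game)
print(regret_matching(np.array([3.0, -1.0, 1.0])))  # -> [0.75 0.   0.25]
```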
✨ Reinforcement Learning
- Proximal Policy Optimization with Generalized Advantage Estimation
- Deep Q Networks with Dueling Network, Prioritized Replay, and Double Q Network
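As an example of one of the moving parts here, generalized advantage estimation blends n-step advantage estimates with an exponential decay λ. A minimal sketch of the backward recurrence, assuming a single rollout of rewards, value estimates (including a bootstrap value), and done flags:

```python
import torch

def gae(rewards, values, dones, gamma=0.99, lam=0.95):
    """rewards, dones: (T,); values: (T + 1,), the last entry is the bootstrap value."""
    T = rewards.shape[0]
    advantages = torch.zeros(T)
    last = 0.0
    for t in reversed(range(T)):
        mask = 1.0 - dones[t]  # cut the recursion at episode boundaries
        delta = rewards[t] + gamma * values[t + 1] * mask - values[t]
        last = delta + gamma * lam * mask * last
        advantages[t] = last
    return advantages
```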
✨ Optimizers
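These are written from scratch rather than imported from torch.optim. The core Adam update, for instance, keeps exponential moving averages of the gradient and its elementwise square, corrects their initialization bias, and scales the step by the resulting ratio; a minimal per-tensor sketch, not the repo's implementation:

```python
import torch

def adam_step(param, grad, m, v, t, lr=1e-3, betas=(0.9, 0.999), eps=1e-8):
    """One Adam update on a plain tensor; t is the 1-based step count."""
    b1, b2 = betas
    m.mul_(b1).add_(grad, alpha=1 - b1)            # first-moment EMA
    v.mul_(b2).addcmul_(grad, grad, value=1 - b2)  # second-moment EMA
    m_hat = m / (1 - b1 ** t)                      # bias correction
    v_hat = v / (1 - b2 ** t)
    param.add_(m_hat / (v_hat.sqrt() + eps), alpha=-lr)
```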
✨ Normalization Layers
- Batch Normalization
- Layer Normalization (a minimal sketch follows this list)
- Instance Normalization
- Group Normalization
- Weight Standardization
- Batch-Channel Normalization
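These are likewise built from first principles rather than wrapping the PyTorch built-ins. For example, layer normalization standardizes every sample over its feature dimension and then applies a learned gain and bias; a minimal sketch, assuming the last dimension holds the features:

```python
import torch
import torch.nn as nn

class LayerNorm(nn.Module):
    """Normalize each sample over its features, then scale and shift."""
    def __init__(self, d: int, eps: float = 1e-5):
        super().__init__()
        self.eps = eps
        self.gain = nn.Parameter(torch.ones(d))
        self.bias = nn.Parameter(torch.zeros(d))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Per-sample statistics, unlike batch norm's per-batch statistics
        mean = x.mean(dim=-1, keepdim=True)
        var = x.var(dim=-1, keepdim=True, unbiased=False)
        return self.gain * (x - mean) / torch.sqrt(var + self.eps) + self.bias
```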
✨ Distillation
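This covers knowledge distillation in the sense of Hinton et al.: a student network is trained on the teacher's temperature-softened output distribution, blended with the usual cross-entropy on hard labels. A minimal sketch of that combined loss:

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Soft targets from the teacher at temperature T, blended with hard labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # T² rescaling keeps gradient magnitudes comparable
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```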
✨ Adaptive Computation
✨ Uncertainty
Installation
pip install labml-nn
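Each implementation is an ordinary PyTorch module you can import and use. As a purely hypothetical usage sketch (the import path, constructor arguments, and tensor layout below are assumptions; check https://nn.labml.ai/ for the actual API):

```python
import torch
# Assumed import path and signature; verify against the documentation
from labml_nn.transformers.mha import MultiHeadAttention

mha = MultiHeadAttention(heads=8, d_model=512)
x = torch.randn(20, 4, 512)  # assumed (seq_len, batch, d_model) layout
out = mha(query=x, key=x, value=x)
print(out.shape)  # expected: torch.Size([20, 4, 512])
```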
Citing
If you use this for academic research, please cite it using the following BibTeX entry.
@misc{labml,
  author = {Varuna Jayasiri and Nipun Wijerathne},
  title = {labml.ai Annotated Paper Implementations},
  year = {2020},
  url = {https://nn.labml.ai/},
}
Other Projects
🚀 Trending Research Papers
This shows the most popular research papers on social media. It also aggregates links to useful resources like paper explanation videos and discussions.
🧪 labml.ai/labml
This is a library that lets you monitor deep learning model training and hardware usage from your mobile phone. It also comes with a collection of other tools to help you write deep learning code efficiently.
Hashes for labml_nn-0.4.111-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 2907fc84b6a8cd196f21ac5409cebad585139d32a2acddc877aed31c9b88e265
MD5 | 0020017d68a80dec86fad124bade15b0
BLAKE2b-256 | 941cb576f9b77a48b56ba2ca3e3b2a88d160e588cc30f8d33f39c2ac2c6b8c1c