A Keras-like neural network library with an autograd engine operating on a dynamically built DAG of scalar values
KaiTorch
KaiTorch is a deep learning library that dynamically builds a neural network as a directed acyclic graph (DAG) of Scalar
values and implements backprop using reverse-mode autodiff. Heavily over-commented, highly impractical, but hopefully educational.
It implements a Keras-like API that allows you to build models using a Sequential
class with Dense and Dropout layers, with implementations of several commonly used weight initializers, activation functions, optimizers, and loss functions.
This project was inspired by and is an extension of Andrej Karpathy's micrograd :)
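To give a feel for what happens under the hood, here is a minimal sketch of the Scalar-level autograd. The import path (kaitorch.core) and the exact attribute/method names (data, grad, backward()) are assumptions based on the micrograd-style engine this extends; see the tutorial notebooks below for the actual interface.

from kaitorch.core import Scalar  # assumed module path

a = Scalar(2.0)
b = Scalar(-3.0)

# operator overloading builds the DAG one node at a time
c = a * b + a ** 2

# reverse-mode autodiff traverses the DAG in reverse topological order
c.backward()

print(c.data)  # forward value: (2 * -3) + 2**2 = -2
print(a.grad)  # dc/da = b + 2a = 1
print(b.grad)  # dc/db = a = 2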
Installation
pip install kaitorch
Tutorial Notebooks
- Functions and Gradients
- Jupyter Notebook // Google Colab
- keywords: functions, derivatives, gradients
- Functions as a Feed Forward Neural Net
- Jupyter Notebook // Google Colab
- keywords: directed acyclic graph, operator overloading, magic methods, recursion
- Reverse-mode Autodiff and Backpropagation
- Jupyter Notebook // Google Colab
- keywords: chain rule, reverse-mode autodiff, topological sort, backprop
- Activation Functions
- Jupyter Notebook // Google Colab
- keywords: sigmoid, tanh, ReLU, LeakyReLU, ELU, swish
- Multi-layer Perceptron and Weight Initialization
- Jupyter Notebook // Google Colab
- keywords: dense layer, multi-layer perceptron, weight initialization, sequential class
- Loss Functions
- Jupyter Notebook // Google Colab
- keywords: mean squared error, binary crossentropy, categorical crossentropy
- Gradient Descent and Optimizers (*personal favorite)
- Jupyter Notebook // Google Colab
- keywords: gradient descent, learning rate, momentum, adagrad, rmsprop, adam
- Inverted Dropout
- Jupyter Notebook // Google Colab
- keywords: dropout layer, inverted dropout, regularization
Example Notebooks
- Regression
- Jupyter Notebook // Google Colab
- Binary Classification
- Jupyter Notebook // Google Colab
- Multi-class Classification
- Jupyter Notebook // Google Colab
Keras-esque API
Building a Neural Net
from kaitorch.models import Sequential
from kaitorch.layers import Dense, Dropout
from kaitorch.losses import CategoricalCrossentropy
from kaitorch.optimizers import Adam
from kaitorch.activations import LeakyReLU
from kaitorch.initializers import LecunNormal
model = Sequential()

# layers accept either string identifiers or configurable objects
model.add(Dense(12, activation='sigmoid', initializer='he_normal'))
model.add(Dropout(0.25))
model.add(Dense(12, activation=LeakyReLU(alpha=0.01), initializer=LecunNormal()))
model.add(Dense(3, activation='softmax'))

model.compile(
    optimizer=Adam(lr=0.025),
    loss=CategoricalCrossentropy()
)
Training a Neural Net
history = model.fit(X_train, y_train, epochs=32)
y_pred = model.predict(X_test)
Tracing/Visualization
model.plot_model(filename='trace')