A Keras-like neural network library with an autograd engine operating on a dynamically built DAG of scalar values

Project description

KaiTorch

KaiTorch is a deep learning library that dynamically builds a neural network as a directed acyclic graph (DAG) of Scalar values and implements backprop using reverse-mode autodiff. Heavily over-commented, highly impractical, but hopefully educational.
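To make the core idea concrete, here is a minimal sketch (in the spirit of micrograd, not KaiTorch's actual internals) of a Scalar value that records the operations producing it as a DAG, then applies the chain rule in reverse topological order. All names here are illustrative:

```python
class Scalar:
    """A single value in the computation graph, with its gradient."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # how to push grad to children
        self._prev = set(_children)     # nodes this value was computed from

    def __add__(self, other):
        other = other if isinstance(other, Scalar) else Scalar(other)
        out = Scalar(self.data + other.data, (self, other))

        def _backward():
            self.grad += out.grad       # d(a+b)/da = 1
            other.grad += out.grad      # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Scalar) else Scalar(other)
        out = Scalar(self.data * other.data, (self, other))

        def _backward():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the DAG, then apply the chain rule in reverse.
        topo, visited = [], set()

        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)

        build(self)
        self.grad = 1.0                 # d(out)/d(out) = 1
        for node in reversed(topo):
            node._backward()


a, b = Scalar(2.0), Scalar(3.0)
c = a * b + a        # c = 2*3 + 2 = 8
c.backward()         # a.grad = b + 1 = 4, b.grad = a = 2
```

Every operation creates a new node pointing back at its operands, so the graph is built dynamically as the forward pass runs; `backward()` only ever has to walk it once.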

It implements a Keras-like API that allows you to build models using a Sequential class with Dense and Dropout layers, with implementations of several commonly used weight initializers, activation functions, optimizers, and loss functions.

This project was inspired by and is an extension of Andrej Karpathy's micrograd :)


Installation

pip install kaitorch

Tutorial Notebooks

  1. Functions and Gradients
    • Jupyter Notebook // Google Colab
    • keywords: functions, derivatives, gradients
  2. Functions as a Feed Forward Neural Net
    • Jupyter Notebook // Google Colab
    • keywords: directed acyclic graph, operator overloading, magic methods, recursion
  3. Reverse-mode Autodiff and Backpropagation
    • Jupyter Notebook // Google Colab
    • keywords: chain rule, reverse-mode autodiff, topological sort, backprop
  4. Activation Functions
    • Jupyter Notebook // Google Colab
    • keywords: sigmoid, tanh, ReLU, LeakyReLU, ELU, swish
  5. Multi-layer Perceptron and Weight Initialization
    • Jupyter Notebook // Google Colab
    • keywords: dense layer, multi-layer perceptron, weight initialization, sequential class
  6. Loss Functions
    • Jupyter Notebook // Google Colab
    • keywords: mean squared error, binary crossentropy, categorical crossentropy
  7. Gradient Descent and Optimizers (*personal favorite)
    • Jupyter Notebook // Google Colab
    • keywords: gradient descent, learning rate, momentum, adagrad, rmsprop, adam
  8. Inverted Dropout
    • Jupyter Notebook // Google Colab
    • keywords: dropout layer, inverted dropout, regularization
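As a taste of the last tutorial, inverted dropout can be sketched in a few lines of plain Python (an illustrative standalone function, not KaiTorch's Dropout layer): during training each unit is zeroed with probability `rate` and the survivors are scaled by `1/(1-rate)` so the expected activation is unchanged, making the layer a no-op at inference time.

```python
import random

def inverted_dropout(activations, rate, training=True):
    """Zero each unit with probability `rate`; rescale survivors by 1/(1-rate)."""
    if not training or rate == 0:
        return activations              # identity at inference time
    keep = 1.0 - rate
    return [a / keep if random.random() > rate else 0.0
            for a in activations]
```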

Example Notebooks

  1. Regression
    • Jupyter Notebook // Google Colab
  2. Binary Classification
    • Jupyter Notebook // Google Colab
  3. Multi-class Classification
    • Jupyter Notebook // Google Colab

Keras-esque API

Building a Neural Net

from kaitorch.models import Sequential
from kaitorch.layers import Dense, Dropout
from kaitorch.losses import CategoricalCrossentropy
from kaitorch.optimizers import Adam
from kaitorch.activations import LeakyReLU
from kaitorch.initializers import LecunNormal

model = Sequential()

model.add(Dense(12, activation='sigmoid', initializer='he_normal'))
model.add(Dropout(0.25))
model.add(Dense(12, activation=LeakyReLU(alpha=0.01), initializer=LecunNormal()))
model.add(Dense(3, activation='softmax'))

model.compile(
    optimizer=Adam(lr=0.025),
    loss=CategoricalCrossentropy()
)
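The `Adam(lr=0.025)` optimizer passed to `compile` follows the standard Adam update rule; a one-parameter sketch of a single step (illustrative only, not KaiTorch's internal implementation) looks like this:

```python
import math

def adam_update(param, grad, state, lr=0.025,
                beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step for a scalar parameter.

    `state` carries the running first/second moment estimates
    and the timestep between calls.
    """
    state['t'] += 1
    state['m'] = beta1 * state['m'] + (1 - beta1) * grad        # 1st moment
    state['v'] = beta2 * state['v'] + (1 - beta2) * grad ** 2   # 2nd moment
    m_hat = state['m'] / (1 - beta1 ** state['t'])              # bias correction
    v_hat = state['v'] / (1 - beta2 ** state['t'])
    return param - lr * m_hat / (math.sqrt(v_hat) + eps)
```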

Training a Neural Net

history = model.fit(X_train, y_train, epochs=32)

y_pred = model.predict(X_test)
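The `history` returned by `fit` tracks the compiled loss at each epoch; for a regression model that loss would be mean squared error, which reduces to a one-liner (sketched here in plain Python, outside the Scalar graph):

```python
def mse(y_true, y_pred):
    """Mean squared error over a batch of scalar predictions."""
    n = len(y_true)
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n
```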

Tracing/Visualization

model.plot_model(filename='trace')

(rendered computation-graph trace omitted)

Project details


Download files


Source Distributions

No source distribution files are available for this release.

Built Distribution

kaitorch-0.1.0-py3-none-any.whl (12.7 kB)

Uploaded for Python 3
