SkinnyGrad
An autodiff engine inspired by tinygrad
SkinnyGrad is a tensor autodifferentiation library that I wrote as a side project for fun and learning. By default, a computational graph is built and evaluated lazily with NumPy; GPU acceleration is also available through the CuPy backend extension. At ~1300 lines, skinnygrad is written with simplicity and extensibility in mind, yet it covers a good subset of the features of a torch.Tensor. Kudos to tinygrad, which inspired the RISC-like design of mapping all operations onto 19 low-level ops that the runtime engine optimizes and executes.
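To make that idea concrete, here is a minimal sketch of the decomposition principle in plain NumPy. This is not skinnygrad's actual internals, just an illustration of how a high-level op like matmul can be expressed with a few primitives (reshape, broadcast multiply, sum-reduce):

import numpy as np

def matmul_from_primitives(a, b):
    # (n, k) @ (k, m) as reshape -> broadcast multiply -> reduce over k
    a3 = a[:, :, None]             # (n, k) -> (n, k, 1)
    b3 = b[None, :, :]             # (k, m) -> (1, k, m)
    return (a3 * b3).sum(axis=1)   # (n, k, m) -> (n, m)

a = np.array([[1, 2, 3]])
x = np.array([[4], [5], [6]])
assert (matmul_from_primitives(a, x) == a @ x).all()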
Try it out!
pip install skinnygrad
from skinnygrad import tensors  # Tensor lives in skinnygrad.tensors, per the repr below
a = tensors.Tensor(((1, 2, 3),))        # shape (1, 3)
b = tensors.Tensor(10)                  # scalar, broadcast on add
x = tensors.Tensor(((4,), (5,), (6,)))  # shape (3, 1)
y = a @ x + b  # builds the graph lazily; nothing is computed yet
print(y)
# <skinnygrad.tensors.Tensor(
# <skinnygrad.llops.Symbol(UNREALIZED <Op(ADD)>, shape=(1, 1))>,
# self.requires_grad=False,
# self.gradient=None,
# )>
print(y.realize())  # forces evaluation (NumPy backend by default)
# [[42]]
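For comparison, the same computation runs eagerly in plain NumPy (skinnygrad's default backend) and yields the same result:

import numpy as np

a = np.array([[1, 2, 3]])
b = 10
x = np.array([[4], [5], [6]])
print(a @ x + b)  # [[42]] -- computed immediately, no graph or realize() step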
LeNet-5 as a convergence test
As an end-to-end test for the engine, I replicated LeNet-5, the convolutional neural network (CNN) for handwritten digit recognition from the original paper. Trained on MNIST, the model reaches 98% accuracy on the evaluation set after about 5 epochs. With a batch size of 64, a training epoch (60k images) takes a few minutes using the CuPy GPU-acceleration backend on an NVIDIA A100. The code for the experiment can be found in the examples folder.
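For reference, the layer-by-layer shape arithmetic of LeNet-5 from the original paper (the architecture the example replicates) can be checked with a few lines of Python:

def conv_out(size, kernel, stride=1):
    # output spatial size of a valid convolution / subsampling window
    return (size - kernel) // stride + 1

size = 32                    # MNIST digits padded from 28x28 to 32x32
size = conv_out(size, 5)     # C1: 6 feature maps, 28x28
size = conv_out(size, 2, 2)  # S2: 2x2 subsampling -> 14x14
size = conv_out(size, 5)     # C3: 16 feature maps, 10x10
size = conv_out(size, 2, 2)  # S4: 2x2 subsampling -> 5x5
flat = 16 * size * size      # flatten -> 400 features
print(flat)                  # then fully connected: 400 -> 120 -> 84 -> 10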
BONUS: the computational graph built up by the skinnygrad engine for LeNet-5