autodiff engine inspired by tinygrad

SkinnyGrad

SkinnyGrad is a tensor autodifferentiation library that I wrote as a side project for fun and learning. By default, a computational graph is built and evaluated lazily with NumPy. GPU acceleration is also available through the CuPy backend extension. At ~1300 lines, skinnygrad is written with simplicity and extensibility in mind, yet it covers a good subset of the features of a torch.Tensor. Kudos to tinygrad, which inspired the RISC-like design of mapping all high-level ops onto 19 low-level ops that the runtime engine optimizes and executes.
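The lazy-evaluation idea can be sketched in a few lines. This is a hypothetical mock-up, not skinnygrad's actual internals: operators record a graph node instead of computing, and realize() walks the graph with NumPy.

```python
import numpy as np

# Toy lazy computation graph (illustrative names, not skinnygrad's classes).
class Node:
    def __init__(self, op, *parents, value=None):
        self.op, self.parents, self.value = op, parents, value

    def __add__(self, other):
        return Node("ADD", self, other)      # record the op, don't compute

    def __matmul__(self, other):
        return Node("MATMUL", self, other)

    def realize(self):
        if self.op == "LOAD":                # leaf: concrete NumPy data
            return self.value
        args = [p.realize() for p in self.parents]
        return {"ADD": np.add, "MATMUL": np.matmul}[self.op](*args)

a = Node("LOAD", value=np.array([[1, 2, 3]]))
x = Node("LOAD", value=np.array([[4], [5], [6]]))
b = Node("LOAD", value=np.array(10))
y = a @ x + b          # builds an UNREALIZED graph, no math yet
print(y.op)            # ADD
print(y.realize())     # [[42]]
```

Deferring execution like this lets the engine see the whole graph before running it, which is what makes backend swaps (NumPy vs. CuPy) and op-level optimization possible.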

Try it out!

pip install skinnygrad
from skinnygrad import tensors

a = tensors.Tensor((1, 2, 3))
b = tensors.Tensor(10)
x = tensors.Tensor(((4,), (5,), (6,)))
y = a @ x + b
print(y)
# <skinnygrad.tensors.Tensor(
#   <skinnygrad.llops.Symbol(UNREALIZED <Op(ADD)>, shape=(1, 1))>,
#   self.requires_grad=False,
#   self.gradient=None,
# )>
print(y.realize())
# [[42]]
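The requires_grad and gradient fields in the printout hint at the reverse-mode autodiff the engine performs. Here is a toy scalar version of that idea (illustrative only; skinnygrad's real engine operates on tensor Symbols):

```python
# Toy reverse-mode autodiff for ADD and MUL. Each Value stores its data
# and a gradient that backward() fills in via the chain rule.
class Value:
    def __init__(self, data, parents=(), backprop=lambda g: None):
        self.data, self.parents, self._backprop = data, parents, backprop
        self.gradient = 0.0

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backprop(g):
            self.gradient += g               # d(a+b)/da = 1
            other.gradient += g              # d(a+b)/db = 1
        out._backprop = backprop
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backprop(g):
            self.gradient += g * other.data  # d(a*b)/da = b
            other.gradient += g * self.data  # d(a*b)/db = a
        out._backprop = backprop
        return out

    def backward(self):
        # naive traversal; fine for the tree-shaped graph in this example
        self.gradient = 1.0
        stack = [self]
        while stack:
            v = stack.pop()
            v._backprop(v.gradient)
            stack.extend(v.parents)

x = Value(3.0)
y = x * x + x        # y = x^2 + x, so dy/dx = 2x + 1 = 7 at x = 3
y.backward()
print(x.gradient)    # 7.0
```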

LeNet-5 as a convergence test

As an end-to-end test for the engine, I replicated the LeNet-5 paper and trained it for 5 epochs on MNIST, recovering the 98% eval accuracy. With a batch size of 64, a training epoch (60k images) takes a few minutes using the CuPy GPU-acceleration backend on an NVIDIA A100 GPU. The code can be found in the examples folder.
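For reference, the layer sizes of the classic LeNet-5 (LeCun et al., 1998) that the convergence test reproduces can be checked with a little shape arithmetic, assuming 32x32 inputs, valid 5x5 convolutions, and 2x2 pooling:

```python
# LeNet-5 shape walkthrough (classic architecture, 32x32 input).
def conv2d_out(n, k):          # valid convolution output size
    return n - k + 1

def pool_out(n, s=2):          # non-overlapping pooling
    return n // s

n = 32                         # input: 1x32x32 (MNIST 28x28, padded)
n = conv2d_out(n, 5)           # C1: 6 feature maps, 28x28
n = pool_out(n)                # S2: 6 maps, 14x14
n = conv2d_out(n, 5)           # C3: 16 feature maps, 10x10
n = pool_out(n)                # S4: 16 maps, 5x5
flat = 16 * n * n              # flatten before the dense layers
print(flat)                    # 400, then FC 400 -> 120 -> 84 -> 10
```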

BONUS: LeNet-5 forward pass built up by the skinnygrad engine

[figure: lenet-fwd]
