
Take it slow, compute gradients

Project description

slowgrad

A small neural network library optimized for learning.

Inspired by PyTorch, micrograd, and tinygrad.

Build an MNIST ConvNet

from slowgrad.layers import Linear, Conv2d

class TinyConvNetLayer:
  def __init__(self):
    # two conv layers followed by a linear classifier over 10 MNIST classes
    self.c1 = Conv2d(1, 8, kernel_size=(3, 3))
    self.c2 = Conv2d(8, 16, kernel_size=(3, 3))
    # 28x28 -> conv 3x3 -> 26x26 -> pool -> 13x13 -> conv 3x3 -> 11x11 -> pool -> 5x5
    self.l1 = Linear(16 * 5 * 5, 10)

  def parameters(self):
    # flat list of all trainable tensors, e.g. to hand to an optimizer
    return [*self.l1.get_parameters(), *self.c1.get_parameters(), *self.c2.get_parameters()]

  def forward(self, x):
    x = x.reshape(shape=(-1, 1, 28, 28))
    x = self.c1(x).relu().max_pool2d()
    x = self.c2(x).relu().max_pool2d()
    x = x.reshape(shape=[x.shape[0], -1])
    return self.l1(x).logsoftmax()
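
A minimal sketch of running a forward pass through the model above. The Tensor import path and constructor are assumptions (this page only shows the layer API); adjust them to slowgrad's actual tensor type.

import numpy as np
from slowgrad.tensor import Tensor  # assumed module path, not confirmed by this page

model = TinyConvNetLayer()
# batch of four fake 28x28 grayscale images; forward() reshapes internally
x = Tensor(np.random.randn(4, 28 * 28).astype(np.float32))
log_probs = model.forward(x)  # expected shape: (4, 10) log-probabilities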

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
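
Alternatively, the package can be installed directly from PyPI:

pip install slowgrad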

Source Distribution

slowgrad-0.1.1.tar.gz (6.1 kB)

Built Distribution

slowgrad-0.1.1-py3-none-any.whl (8.3 kB, Python 3)
