

nanograd

A lightweight deep learning framework.

Description · Features · TODO · License

Description

Despite the name, nanograd is not a city in Russia...

However, it is a lightweight, PyTorch-like deep learning framework. Use it to implement any DL algorithm you want with little boilerplate code.

nanograd is a continuously evolving project. The goal is to implement as many features as possible while using as few abstraction layers as possible (only NumPy functions are allowed). Any contribution to the repo is welcome.

The library has a built-in automatic differentiation engine that dynamically builds a computational graph. The framework ships with the basic features needed to train neural nets: basic ops, layers, weight initializers, optimizers and loss functions. Additional tools help you visualize your network: a computational graph visualizer and, soon, activation map visualizers.
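
For intuition, here is a minimal sketch of how such an engine can work: each operation records its inputs and a closure that propagates gradients, and backward() walks the recorded graph in reverse topological order. The Var class below is a hypothetical illustration, not nanograd's actual implementation.

import numpy as np

# Minimal dynamic-autodiff sketch -- illustrative only, not nanograd's code.
class Var:
    def __init__(self, data, parents=()):
        self.data = np.asarray(data, dtype=float)
        self.grad = np.zeros_like(self.data)
        self.parents = parents        # graph edges, recorded as ops execute
        self.backward_fn = None       # propagates self.grad to the parents

    def __add__(self, other):
        out = Var(self.data + other.data, parents=(self, other))
        def backward_fn():
            self.grad += out.grad     # d(a+b)/da = 1
            other.grad += out.grad    # d(a+b)/db = 1
        out.backward_fn = backward_fn
        return out

    def __mul__(self, other):
        out = Var(self.data * other.data, parents=(self, other))
        def backward_fn():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out.backward_fn = backward_fn
        return out

    def backward(self):
        # topologically sort the recorded graph, then apply the chain rule in reverse
        order, seen = [], set()
        def visit(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p in v.parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = np.ones_like(self.data)      # seed: d(out)/d(out) = 1
        for v in reversed(order):
            if v.backward_fn is not None:
                v.backward_fn()

x, y = Var(2.0), Var(3.0)
z = x * y + x
z.backward()
print(x.grad, y.grad)   # 4.0 2.0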

The repo will be updated regularly with new features and examples.

Inspired by geohot's tinygrad.

Features

  • PyTorch-like automatic differentiation engine (dynamically constructed computational graph)
  • Weight initialization: Glorot uniform, Glorot normal, Kaiming uniform, Kaiming normal (see the sketch after this list)
  • Activations: ReLU, Sigmoid, tanh, Swish, ELU, LeakyReLU
  • Convolutions: Conv1d, Conv2d, MaxPool2d, AvgPool2d
  • Layers: Linear, BatchNorm1d, BatchNorm2d, Flatten, Dropout
  • Optimizers: SGD, Adam, AdamW
  • Loss functions: CrossEntropyLoss, mean squared error
  • Computational graph visualizer (see example)
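
For reference, the listed initializers follow the standard Glorot and Kaiming formulas, which scale random weights by the layer's fan-in and fan-out. The helpers below are a NumPy sketch of those formulas, not nanograd's exact API:

import numpy as np

# Standard Glorot/Kaiming formulas -- a sketch, not nanograd's exact API.
def glorot_uniform(fan_in, fan_out):
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return np.random.uniform(-limit, limit, size=(fan_in, fan_out))

def kaiming_normal(fan_in, fan_out):
    std = np.sqrt(2.0 / fan_in)   # suited to ReLU-family activations
    return np.random.normal(0.0, std, size=(fan_in, fan_out))

W = glorot_uniform(784, 128)      # e.g. weights of a Linear(784, 128) layer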

A quick side-by-side comparison between PyTorch and nanograd for tensor computations.

Basic tensor calculations

PyTorch

import torch

a = torch.empty((30, 30, 2)).normal_(mean=3, std=4)
b = torch.empty((30, 30, 1)).normal_(mean=10, std=2)

a.requires_grad = True
b.requires_grad = True

c = a + b
d = c.relu()
e = c.sigmoid()
f = d * e

f.sum().backward()

print(a.grad)
print(b.grad)

Nanograd

# assuming the Tensor class has been imported from nanograd
a = Tensor.normal(3, 4, (30, 30, 2), requires_grad=True)
b = Tensor.normal(10, 2, (30, 30, 1), requires_grad=True)

c = a + b
d = c.relu()
e = c.sigmoid()
f = d * e

f.backward()  # unlike the PyTorch snippet above, no explicit sum() is needed

print(a.grad)
print(b.grad)

Training a CNN on MNIST

# Assumes CNN, CrossEntropyLoss, SGD and the tensor module come from nanograd,
# and that X_train, Y_train, X_test, Y_test already hold the MNIST arrays.
import numpy as np
from tqdm import tqdm

# Model, loss & optim
model = CNN()
loss_function = CrossEntropyLoss()
optim = SGD(model.parameters(), lr=0.01, momentum=0)

# Training loop
BS = 128
losses, accuracies = [], []
STEPS = 1000

for i in tqdm(range(STEPS), total=STEPS):
  # sample a random mini-batch
  samp = np.random.randint(0, X_train.shape[0], size=BS)
  X = tensor.Tensor(X_train[samp])
  Y = tensor.Tensor(Y_train[samp])

  optim.zero_grad()

  out = model(X)

  # batch accuracy from the argmax of the logits
  cat = out.data.argmax(1)
  accuracy = (cat == Y.data).mean()

  loss = loss_function(out, Y)
  loss.backward()

  optim.step()

  loss, accuracy = float(loss.data), float(accuracy)
  losses.append(loss)
  accuracies.append(accuracy)

# evaluate on the held-out test set
Y_test_preds = model(tensor.Tensor(X_test)).data.argmax(1)
print((Y_test == Y_test_preds).mean())

Visualizing a computational graph

Visualizing a computational graph has never been easier: just call plot_forward or plot_backward.

f.plot_forward()

f.plot_backward()

TODO

  • Solve batchnorm issues
  • Add GRU, LSTM cells
  • Add code examples with EfficientNet-B0, CIFAR-10 and MNIST
  • Code a transformer with nanograd and train it on a GPU

License

MIT


GitHub @PABannier  ·  Twitter @el_PA_B


Download files

Download the file for your platform.

Source Distribution

nanograd-1.0.4.tar.gz (29.9 kB)


Built Distribution


nanograd-1.0.4-py3-none-any.whl (33.8 kB)


File details

Details for the file nanograd-1.0.4.tar.gz.

File metadata

  • Download URL: nanograd-1.0.4.tar.gz
  • Upload date:
  • Size: 29.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/49.2.1 requests-toolbelt/0.9.1 tqdm/4.56.0 CPython/3.9.1

File hashes

Hashes for nanograd-1.0.4.tar.gz
Algorithm    Hash digest
SHA256       a4f3e77a97265cd467c7d7631fe341b1d9a1e9d7844160e165a551aa01a5be7c
MD5          6a0ed613f69ef6f0182527cb22c5f193
BLAKE2b-256  d1ab5641603b7c931d4017dd361e52326933552a98aeaee0c6ccd446b869c598

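If you want to verify a download by hand, a minimal check of the archive against the SHA256 digest listed above (using only Python's standard library) looks like this:

import hashlib

# verify the downloaded archive against the digest published on this page
expected = "a4f3e77a97265cd467c7d7631fe341b1d9a1e9d7844160e165a551aa01a5be7c"
with open("nanograd-1.0.4.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
assert digest == expected, "hash mismatch -- do not install this file"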

File details

Details for the file nanograd-1.0.4-py3-none-any.whl.

File metadata

  • Download URL: nanograd-1.0.4-py3-none-any.whl
  • Upload date:
  • Size: 33.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/49.2.1 requests-toolbelt/0.9.1 tqdm/4.56.0 CPython/3.9.1

File hashes

Hashes for nanograd-1.0.4-py3-none-any.whl
Algorithm    Hash digest
SHA256       fb2e3813cc552fa2486b86a241b2beab10b1f1a38bb7186a88720c1bd4c62aaf
MD5          613b8c60120cfb663c7378be568eda0f
BLAKE2b-256  bcaad770926dae1b4e4790d2e7b6dbad9a5eec8d26c8a8ee5e5125444ab21f06

