
You like pytorch? You like micrograd? You love tinygrad! ❤️


For something in between a pytorch and a karpathy/micrograd

This may not be the best deep learning framework, but it is a deep learning framework.

The Tensor class is a wrapper around a numpy array, except it does Tensor things.
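
For example (a minimal sketch; the wrapped array is exposed as the .data attribute in early versions, but treat that attribute name as an implementation detail):

import numpy as np
from tinygrad.tensor import Tensor

t = Tensor(np.eye(3, dtype=np.float32))  # wraps a plain numpy array
print(t.data)         # the underlying numpy array (attribute name assumed from early versions)
print(t.relu().data)  # "Tensor things": ops like relu build the autograd graph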

Installation

pip3 install tinygrad

Example

from tinygrad.tensor import Tensor

x = Tensor.eye(3)
y = Tensor([[2.0,0,-2.0]])
z = y.dot(x).sum()
z.backward()

print(x.grad)  # dz/dx
print(y.grad)  # dz/dy
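
Since z = sum_ij y[0][i] * x[i][j], we get dz/dx[i][j] = y[0][i] and dz/dy[0][i] = sum_j x[i][j]. So x.grad should be [[2, 2, 2], [0, 0, 0], [-2, -2, -2]] and y.grad should be [[1, 1, 1]], identical to what the torch version below prints.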

Same example in torch

import torch

x = torch.eye(3, requires_grad=True)
y = torch.tensor([[2.0,0,-2.0]], requires_grad=True)
z = y.matmul(x).sum()
z.backward()

print(x.grad)  # dz/dx
print(y.grad)  # dz/dy

Neural networks?

It turns out a decent autograd tensor library is 90% of what you need for neural networks. Add an optimizer (SGD, RMSprop, and Adam are implemented) from tinygrad.optim, write some boilerplate minibatching code (a sketch follows the example below), and you have all you need.

Neural network example (from test/test_mnist.py)

from tinygrad.tensor import Tensor
import tinygrad.optim as optim
from tinygrad.utils import layer_init_uniform

class TinyBobNet:
  def __init__(self):
    self.l1 = Tensor(layer_init_uniform(784, 128))
    self.l2 = Tensor(layer_init_uniform(128, 10))

  def forward(self, x):
    return x.dot(self.l1).relu().dot(self.l2).logsoftmax()

model = TinyBobNet()
optimizer = optim.SGD([model.l1, model.l2], lr=0.001)  # renamed to avoid shadowing the optim module

# ... and complete like pytorch, with (x,y) data

out = model.forward(x)
loss = out.mul(y).mean()  # NLL loss when y holds scaled negative one-hot targets
loss.backward()
optimizer.step()
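
The "(x,y) data" part of the loop might look like this (a sketch, not tinygrad API: X_train and Y_train are hypothetical numpy arrays of flattened MNIST images and integer labels; the -10.0 is -1 times the number of classes, so mean() over all BS*10 entries equals the average negative log-probability of the target class):

import numpy as np

BS = 128
for step in range(1000):
  samp = np.random.randint(0, X_train.shape[0], size=(BS,))
  x = Tensor(X_train[samp].reshape((-1, 784)).astype(np.float32))

  # scaled negative one-hot targets, so out.mul(y).mean() acts as an NLL loss
  y = np.zeros((BS, 10), dtype=np.float32)
  y[range(BS), Y_train[samp]] = -10.0
  y = Tensor(y)

  out = model.forward(x)
  loss = out.mul(y).mean()
  loss.backward()
  optimizer.step()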

The promise of small

tinygrad, with tests, will always be below 1000 lines. If it isn't, we will revert commits until tinygrad becomes smaller.

Running tests

python -m pytest

TODO

  • Reduce code
  • Increase speed
  • Add features
  • In that order
