
You like pytorch? You like micrograd? You love tinygrad! <3

Project description


tinygrad: For something between PyTorch and karpathy/micrograd. Maintained by tiny corp.

Homepage | Documentation | Examples | Showcase | Discord



This may not be the best deep learning framework, but it is a deep learning framework.

Due to its extreme simplicity, it aims to be the easiest framework to add new accelerators to, with support for both inference and training. If XLA is CISC, tinygrad is RISC.

tinygrad is still alpha software, but we raised some money to make it good. Someday, we will tape out chips.

Features

LLaMA and Stable Diffusion

tinygrad can run LLaMA and Stable Diffusion!

Laziness

Try a matmul. See how, despite being written as a broadcasted multiply followed by a sum, it is fused into one kernel with the power of laziness.

DEBUG=3 python3 -c "from tinygrad.tensor import Tensor;
N = 1024; a, b = Tensor.rand(N, N), Tensor.rand(N, N);
c = (a.reshape(N, 1, N) * b.permute(1,0).reshape(1, N, N)).sum(axis=2);
print((c.numpy() - (a.numpy() @ b.numpy())).mean())"

And we can change DEBUG to 4 to see the generated code.
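For example, re-running the same snippet with a higher debug level prints the generated kernel source:

DEBUG=4 python3 -c "from tinygrad.tensor import Tensor;
N = 1024; a, b = Tensor.rand(N, N), Tensor.rand(N, N);
c = (a.reshape(N, 1, N) * b.permute(1,0).reshape(1, N, N)).sum(axis=2);
print((c.numpy() - (a.numpy() @ b.numpy())).mean())"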

Neural networks

As it turns out, 90% of what you need for neural networks is a decent autograd/tensor library. Throw in an optimizer, a data loader, and some compute, and you have all you need.

Neural network example (from test/models/test_mnist.py)

from tinygrad.tensor import Tensor
import tinygrad.nn.optim as optim

class TinyBobNet:
  def __init__(self):
    self.l1 = Tensor.uniform(784, 128)
    self.l2 = Tensor.uniform(128, 10)

  def forward(self, x):
    return x.dot(self.l1).relu().dot(self.l2).log_softmax()

model = TinyBobNet()
optim = optim.SGD([model.l1, model.l2], lr=0.001)

# ... complete data loader here

out = model.forward(x)
loss = out.mul(y).mean()
optim.zero_grad()
loss.backward()
optim.step()
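The data loader itself is elided above. As a rough sketch (the array names, batch size, and random data are purely illustrative), the snippet below builds one fake batch in the shape the training step expects; y is a negated, scaled one-hot encoding of the labels so that out.mul(y).mean() behaves like an NLL-style loss:

import numpy as np
from tinygrad.tensor import Tensor

# hypothetical stand-in for a real MNIST data loader: one random batch
X_batch = np.random.rand(64, 784).astype(np.float32)   # 64 fake flattened 28x28 images
labels = np.random.randint(0, 10, size=64)              # 64 fake class labels

# negated, scaled one-hot targets so out.mul(y).mean() acts as an NLL-style loss
Y_batch = np.zeros((64, 10), dtype=np.float32)
Y_batch[range(64), labels] = -10.0

x, y = Tensor(X_batch), Tensor(Y_batch)

With x and y defined this way, the training step above runs as written.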

Accelerators

tinygrad already supports numerous accelerators, including CPU, GPU (OpenCL), CUDA, METAL, LLVM, and Clang-compiled C.

And it is easy to add more! Your accelerator of choice only needs to support a total of 26 (optionally 27) low-level ops. More information can be found in the documentation for adding new accelerators.
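To give a feel for what "low-level ops" means here, the sketch below groups the kinds of primitives a backend typically has to provide. The names are purely illustrative and are not tinygrad's actual op list; the authoritative set lives in the source and the accelerator documentation.

from enum import Enum, auto

# Illustrative only -- not tinygrad's real interface.
class UnaryOps(Enum):
  EXP = auto(); LOG = auto(); NEG = auto()                            # elementwise, one input
class BinaryOps(Enum):
  ADD = auto(); MUL = auto(); MAX = auto()                            # elementwise, two inputs
class ReduceOps(Enum):
  SUM = auto(); MAX = auto()                                          # collapse an axis
class MovementOps(Enum):
  RESHAPE = auto(); PERMUTE = auto(); EXPAND = auto(); PAD = auto()   # shape only, no compute

Everything higher level (matmul, conv, softmax, ...) is composed out of a small vocabulary like this, which is why a new backend stays small.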

Installation

The current recommended way to install tinygrad is from source.

From source

git clone https://github.com/tinygrad/tinygrad.git
cd tinygrad
python3 -m pip install -e .

Don't forget the . at the end!
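As a quick smoke test (not part of the official instructions), you can create a small tensor and read it back as a numpy array:

python3 -c "from tinygrad.tensor import Tensor; print(Tensor.rand(2, 2).numpy())"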

Documentation

Documentation along with a quick start guide can be found in the docs/ directory.

Quick example comparing to PyTorch

from tinygrad.tensor import Tensor

x = Tensor.eye(3, requires_grad=True)
y = Tensor([[2.0,0,-2.0]], requires_grad=True)
z = y.matmul(x).sum()
z.backward()

print(x.grad.numpy())  # dz/dx
print(y.grad.numpy())  # dz/dy

The same thing but in PyTorch:

import torch

x = torch.eye(3, requires_grad=True)
y = torch.tensor([[2.0,0,-2.0]], requires_grad=True)
z = y.matmul(x).sum()
z.backward()

print(x.grad.numpy())  # dz/dx
print(y.grad.numpy())  # dz/dy
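If both frameworks are installed, you can check that the two snippets agree numerically; this small sketch just combines the code above and compares the gradients:

import numpy as np
import torch
from tinygrad.tensor import Tensor

# tinygrad
x_tg = Tensor.eye(3, requires_grad=True)
y_tg = Tensor([[2.0, 0, -2.0]], requires_grad=True)
y_tg.matmul(x_tg).sum().backward()

# PyTorch
x_pt = torch.eye(3, requires_grad=True)
y_pt = torch.tensor([[2.0, 0, -2.0]], requires_grad=True)
y_pt.matmul(x_pt).sum().backward()

# the gradients should match up to floating point tolerance
print(np.allclose(x_tg.grad.numpy(), x_pt.grad.numpy()))  # dz/dx
print(np.allclose(y_tg.grad.numpy(), y_pt.grad.numpy()))  # dz/dy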

Contributing

There has been a lot of interest in tinygrad lately. Here are some basic guidelines for contributing:

  • Bug fixes are the best and always welcome!
  • If you don't understand the code you are changing, don't change it!
  • All code golf PRs will be closed, but conceptual cleanups are great.
  • Features are welcome, but if you are adding one, you need to include tests.
  • Improving test coverage is great, as long as the new tests are reliable and not brittle.

Additional guidelines can be found in CONTRIBUTING.md.

Running tests

For more examples of how to run the full test suite, please refer to the CI workflow.

Some examples:

python3 -m pip install -e '.[testing]'
python3 -m pytest
python3 -m pytest -v -k TestTrain
python3 ./test/models/test_train.py TestTrain.test_efficientnet

Download files

Download the file for your platform.

Source Distribution

tinygrad-0.7.0.tar.gz (126.4 kB)


Built Distribution


tinygrad-0.7.0-py3-none-any.whl (107.7 kB)


File details

Details for the file tinygrad-0.7.0.tar.gz.

File metadata

  • Download URL: tinygrad-0.7.0.tar.gz
  • Upload date:
  • Size: 126.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.4

File hashes

Hashes for tinygrad-0.7.0.tar.gz:

  • SHA256: 9a81ee46be716b021006146b2482ae6734e3e35eccfb0c41e334f08da3f31a5a
  • MD5: 361f5b6a9b4f7cb2ed991480316a8eb7
  • BLAKE2b-256: 435aea743541f51026d62dd269eb30a7ce3e61335d3cc485d998eb9b2d75f9fb


File details

Details for the file tinygrad-0.7.0-py3-none-any.whl.

File metadata

  • Download URL: tinygrad-0.7.0-py3-none-any.whl
  • Upload date:
  • Size: 107.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.4

File hashes

Hashes for tinygrad-0.7.0-py3-none-any.whl:

  • SHA256: d5987dd0a21fd4168c7a67d51dfb8d0ab90ca50da86a2e2cc1ee3cb1b0f3f8cc
  • MD5: 9bba33035f9f0c063ed31d9ceb3330a4
  • BLAKE2b-256: 1041d5ca6119dd4c8dfc426a1da27e744d3146b2e9df80688eb811b58ace0110

