Tensor-based autodiff engine and neural network API
mdgrad
A small autograd engine that implements backpropagation (reverse-mode autodiff). Heavily inspired by Karpathy's micrograd, and extended to support operations on tensors instead of scalars. Includes a small, PyTorch-like neural network API for building and training neural networks.
Hopefully useful as an educational resource.
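To make the "reverse-mode autodiff" part concrete, here is a minimal, self-contained sketch of the idea in the scalar setting that micrograd popularized and that mdgrad generalizes to tensors. The `Value` class and its method names are illustrative only, not mdgrad's actual API:

```python
class Value:
    """A scalar node in a computation graph, tracking data and gradient."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._children = _children
        self._backward = lambda: None  # fills in the children's grads

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # chain rule: d(out)/d(self) = other, d(out)/d(other) = self
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for c in v._children:
                    visit(c)
                topo.append(v)
        visit(self)
        self.grad = 1.0  # d(self)/d(self)
        for v in reversed(topo):
            v._backward()

x = Value(3.0)
y = x * x + x          # y = x^2 + x
y.backward()
print(y.data, x.grad)  # 12.0 and dy/dx = 2x + 1 = 7.0
```

Every operation records its inputs and a local gradient rule; a single backward pass then propagates gradients from the output to every input. mdgrad applies the same scheme where each node holds a tensor rather than a scalar.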
Installation
pip install mdgrad
Example Usage
A contrived example showing supported operations:
import mdgrad

a = 3 * mdgrad.randn(3, 2)
b = mdgrad.ones(shape=(2, 2))
c = a @ b          # matrix multiplication
d = c * 3 / 2
e = d ** 2
f = e.sum()
print(f.data)      # value of the forward pass
f.backward()       # backpropagate through the graph
print(a.grad)      # gradient of f with respect to a
An example showing how to define and run a neural network. See the files in examples/ for more details on building and training models.
import mdgrad
import mdgrad.nn as nn
# Define the model and loss function
model = nn.Sequential(
    nn.Linear(2, 20),
    nn.ReLU(),
    nn.Linear(20, 50),
    nn.ReLU(),
    nn.Linear(50, 15),
    nn.ReLU(),
    nn.Linear(15, 1),
    nn.Sigmoid()
)
loss_fn = nn.MSELoss()
# Create dummy data
X = mdgrad.randn(100, 2)
target = mdgrad.randn(100, 1)
# Compute output and loss
out = model(X)
loss = loss_fn(out, target)
# Compute gradients of parameters
loss.backward()
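After loss.backward() fills in the parameter gradients, a training step updates each parameter against its gradient. The snippet below is a plain-Python sketch of that gradient-descent update using stand-in numbers, not mdgrad tensors; see examples/ for real training loops:

```python
# Hypothetical values standing in for a model's parameters and the
# gradients that backpropagation produced for them.
lr = 0.01                    # learning rate
params = [0.5, -1.2]         # current parameter values
grads = [0.3, -0.4]          # d(loss)/d(param) for each parameter

# One step of gradient descent: move each parameter opposite its gradient.
params = [p - lr * g for p, g in zip(params, grads)]
print(params)
```

Repeating forward pass, backward pass, and this update drives the loss down over many iterations.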
Project details
Download files
Source Distribution: mdgrad-0.3.tar.gz (11.8 kB)
Built Distribution: mdgrad-0.3-py3-none-any.whl (12.0 kB)