
A walkthrough of a small engine for automatic differentiation

Project description

fauxgrad



There are plenty of excellent (tinygrad) and minimalist (micrograd) built-from-scratch deep learning frameworks out there, so fauxgrad sacrifices some of that full functionality and focuses instead on the general ideas and building blocks for writing your own.

The walkthrough/tutorial can be found in this notebook.
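
To make the "building blocks" idea concrete, here is a minimal sketch of a reverse-mode Value node in the spirit of micrograd. It is illustrative only, not fauxgrad's actual source; the attribute names (_parents, _local_grads) are assumptions:

class Value:
    """A scalar that records how it was computed, so gradients can flow back."""

    def __init__(self, data, _parents=(), _local_grads=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents          # the Values this one was computed from
        self._local_grads = _local_grads  # d(self)/d(parent), one per parent

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        # local derivatives: d(a+b)/da = 1 and d(a+b)/db = 1
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        # local derivatives: d(a*b)/da = b and d(a*b)/db = a
        return Value(self.data * other.data, (self, other), (other.data, self.data))

    def backward(self, grad=1.0):
        # Accumulate the gradient arriving along this path, then push it to
        # each parent, scaled by the local derivative (the chain rule).
        self.grad += grad
        for parent, local in zip(self._parents, self._local_grads):
            parent.backward(grad * local)

This recursive backward revisits shared subgraphs once per path, which is fine for a toy; a full engine would topologically sort the graph and sweep it once.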

Installation

pip install fauxgrad

Examples

Calculating some gradients:

from fauxgrad import Value

a = Value(5)
b = Value(-3)
c = a * b     # c = -15
d = a + c     # d = -10
e = d * 2     # e = -20
e.backward()  # populate .grad on every Value in the graph

print('de/da:', a.grad)
>>> de/da: -4.0
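
This matches the chain rule by hand: e = 2(a + a*b), so de/da = 2*(1 + b) = 2*(1 + (-3)) = -4, which is exactly what backward() accumulates into a.grad.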

Plotting the backward pass graph:

from fauxgrad.utils import plot_graph
plot_graph(e)
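
For intuition about what the plot shows, a graph like this can be produced by walking each Value back through its inputs. Below is a minimal text-only sketch that reuses the hypothetical _parents attribute from the Value sketch above; fauxgrad's real internals may differ:

def print_graph(node, depth=0):
    # Indent each Value by its depth to show the expression tree.
    print('  ' * depth + f'Value(data={node.data}, grad={node.grad})')
    for parent in getattr(node, '_parents', ()):
        print_graph(parent, depth + 1)

print_graph(e)  # textual counterpart of plot_graph(e)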

Download files

Download the file for your platform.

Source Distribution

fauxgrad-0.2.tar.gz (6.5 kB)

Built Distribution

fauxgrad-0.2-py3-none-any.whl (6.1 kB)
