
An autograd engine that sits between micrograd and tinygrad, with a PyTorch-like neural network API :)

Project description

atomgrad


Atomgrad is a simple autograd engine that sits between micrograd and tinygrad: it performs autodiff on vector-valued and scalar-valued tensors (atoms), and it ships with a neural network API library.

Features

  • Supports PyTorch-like vector-valued and scalar-valued tensors.
  • Supports basic unary, binary, reduce, and movement ops (e.g. activation functions, sum, exp, reshape, randint, uniform); a few are sketched after this list.
  • Supports activation functions such as relu, sigmoid, and tanh.
  • Supports binary_cross_entropy and binary_accuracy metrics.
  • Supports graph visualizations via draw_dot.
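
A minimal sketch of a few of these ops. The method names (exp, sum) are assumed here to mirror the PyTorch-style API described above; check the source if they differ:

from atomgrad.atom import Atom

# hypothetical sketch -- exp/sum are assumed PyTorch-style method names
a = Atom([[1.0, 2.0], [3.0, 4.0]], requires_grad=True)

b = a.exp()    # unary op: elementwise e^x
c = b.sum()    # reduce op: sum over all elements
c.backward()   # autodiff through both ops

print(a.grad)  # d(sum(exp(a)))/da = exp(a)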

Installation

You can install atomgrad using pip:

pip install atomgrad==0.3.0
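
To sanity-check the install, a quick smoke test (the import path follows the usage examples below):

python -c "from atomgrad.atom import Atom; print(Atom(1.0))"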

Usage

Here is a simple example of using atomgrad to compute the gradient of a function:

from atomgrad.atom import Atom
from atomgrad.graph import draw_dot


# create two tensors with gradients enabled
x = Atom(2.0, requires_grad=True)
y = Atom(3.0, requires_grad=True)

# define a function
z = x * y + x ** 2

# compute the backward pass
z.backward()
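# by hand: dz/dx = y + 2x = 3 + 4 = 7 and dz/dy = x = 2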

# print the gradients
print(x.grad) # 7.0
print(y.grad) # 2.0

draw_dot(z)

[computation graph rendered by draw_dot]

Here is a simple example of using atomgrad to train a small network with two 16-unit hidden layers for binary classification.

import numpy as np
from atomgrad.atom import Atom
from atomgrad.nn import AtomNet, Layer
from atomgrad.optim import SGD
from atomgrad.metrics import binary_cross_entropy, binary_accuracy

# create a model
model = AtomNet(
Layer(3, 16),  # input size matches the 3 features per sample below
  Layer(16, 16),
  Layer(16, 1)
)
# create an optimizer
optim = SGD(model.parameters(), lr=0.01)

# load some data
x = [[2.0, 3.0, -1.0],
  [3.0, -1.0, 0.5],
  [0.5, 1.0, 1.0],
  [1.0, 1.0, -1.0],
  [0.0, 4.0, 0.5],
  [3.0, -1.0, 0.5]]
y = [1, 1, 0, 1, 0, 1]

x = Atom(x)
y = Atom(y)

model.fit(x, y, optim, binary_cross_entropy, binary_accuracy, epochs=100)

# output:
'''
...
epoch: 30 | loss: 0.14601783454418182  | accuracy: 100.0%
epoch: 35 | loss: 0.11600304394960403  | accuracy: 100.0%
epoch: 40 | loss: 0.09604986757040024  | accuracy: 100.0%
epoch: 45 | loss: 0.0816292017698288  | accuracy: 100.0%
'''
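
After training, predictions come from a forward pass. A hypothetical sketch, assuming AtomNet is callable like a PyTorch module (the actual method may instead be named forward or predict):

# hypothetical inference sketch
preds = model(x)  # forward pass over the training batch
print(preds)      # scores/probabilities, higher for class 1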

Demos

Examples of simple autodiff and four binary classifiers, including the make_moons and MNIST digits datasets, are in the examples/demos.ipynb notebook.

Note: Although atom.nn includes a softmax activation and cat_cross_entropy, model results with them are quite disappointing, likely due to a bug (please let me know if you find it!). As a result, the AtomNet model is currently best suited for binary classification tasks.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

atomgrad-0.3.0.tar.gz (6.9 kB, source)

Built Distribution

atomgrad-0.3.0-py3-none-any.whl (6.6 kB, Python 3)

File details

Details for the file atomgrad-0.3.0.tar.gz.

File metadata

  • Download URL: atomgrad-0.3.0.tar.gz
  • Upload date:
  • Size: 6.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.12.0

File hashes

Hashes for atomgrad-0.3.0.tar.gz

  • SHA256: 83ea0f4e17ec3462eb1a8a85bd33cbb936ca38d93ffe12bbcbbadc8576d3476f
  • MD5: 14afb13b0bcfbcf710a13d596e55afa1
  • BLAKE2b-256: 3d79b1572155c51c7f4628455a63edecdf3b4c7677b0378651b287524f283487

See more details on using hashes here.

File details

Details for the file atomgrad-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: atomgrad-0.3.0-py3-none-any.whl
  • Upload date:
  • Size: 6.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.12.0

File hashes

Hashes for atomgrad-0.3.0-py3-none-any.whl

  • SHA256: 88b8726a55f8a74c11d1a2f41d8edcdcc7a800abd1197321147bc179c0b44250
  • MD5: 53eaddaef77243f36da5ee2e06efa78a
  • BLAKE2b-256: ef77d9ae363b2f714baa01dd281f1e58297a753188735b86eaee15a8c5a7d9aa

See more details on using hashes here.
