An autograd engine that sits between micrograd and tinygrad, with a PyTorch-like neural network API :)
Project description
atomgrad
Atomgrad is a simple autograd engine that aims to sit between micrograd and tinygrad: it performs autodiff on scalar-valued and vector-valued tensors (atoms), and couples this with a neural network API library.
Features
- Supports PyTorch-like vector-valued and scalar-valued tensors.
- Supports basic unary ops, binary ops, reduce ops and movement ops (e.g. `sum`, `exp`, `reshape`, etc.).
- Supports activation functions such as `relu`, `sigmoid`, `tanh`, etc.
- Supports softmax and binary cross entropy.
- Supports graph visualization.
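For reference, softmax and binary cross entropy can be written in a few lines of plain Python. This is a minimal sketch of the standard definitions, not atomgrad's actual implementation:

```python
import math

def softmax(logits):
    # subtract the max logit for numerical stability before exponentiating
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

def binary_cross_entropy(probs, targets, eps=1e-12):
    # mean BCE over predicted probabilities and {0, 1} targets;
    # eps guards against log(0)
    n = len(probs)
    return -sum(y * math.log(p + eps) + (1 - y) * math.log(1 - p + eps)
                for p, y in zip(probs, targets)) / n

print(softmax([1.0, 2.0, 3.0]))          # three probabilities summing to 1
print(binary_cross_entropy([0.5, 0.5], [1, 0]))  # ~0.693 (= ln 2)
```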
Installation
You can install atomgrad using pip:
```
pip install atomgrad==0.2.8
```
Usage
Here is a simple example of using atomgrad to compute the gradient of a function:
```python
from atomgrad.atom import Atom
from atomgrad.graph import draw_dot

# create two tensors with gradients enabled
x = Atom(2.0, requires_grad=True)
y = Atom(3.0, requires_grad=True)

# define a function
z = x * y + x ** 2

# compute the backward pass
z.backward()

# print the gradients
print(x.grad)  # 7.0 (dz/dx = y + 2x)
print(y.grad)  # 2.0 (dz/dy = x)

# visualize the computation graph
draw_dot(z)
```
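As a sanity check independent of atomgrad, the gradients above can be verified numerically with central finite differences in plain Python:

```python
# Numerical check of the analytic gradients for z = x*y + x**2
# at the point (x, y) = (2.0, 3.0).
def z(x, y):
    return x * y + x ** 2

eps = 1e-6
x0, y0 = 2.0, 3.0

# central differences approximate the partial derivatives
dz_dx = (z(x0 + eps, y0) - z(x0 - eps, y0)) / (2 * eps)
dz_dy = (z(x0, y0 + eps) - z(x0, y0 - eps)) / (2 * eps)

print(round(dz_dx, 4))  # 7.0  (= y + 2x)
print(round(dz_dy, 4))  # 2.0  (= x)
```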
Here is a simple example of using atomgrad to train a neural network:
```python
import numpy as np
from atomgrad.atom import Atom
from atomgrad.nn import AtomNet, Layer
from atomgrad.optim import SGD
from atomgrad.metrics import binary_cross_entropy, binary_accuracy

# create a model (the input layer takes 3 features, matching the data below)
model = AtomNet(
    Layer(3, 16),
    Layer(16, 16),
    Layer(16, 1)
)

# create an optimizer
optim = SGD(model.parameters(), lr=0.01)

# load some data: six samples with 3 features each
x = [[2.0, 3.0, -1.0],
     [3.0, -1.0, 0.5],
     [0.5, 1.0, 1.0],
     [1.0, 1.0, -1.0],
     [0.0, 4.0, 0.5],
     [3.0, -1.0, 0.5]]
y = [1, 1, 0, 1, 0, 1]

x = Atom(x)
y = Atom(y)

# train for 100 epochs
model.fit(x, y, optim, binary_cross_entropy, binary_accuracy, epochs=100)
```
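The `fit` call above bundles the whole training loop. As a rough sketch of what such a loop typically does (forward pass, binary cross entropy loss, backward pass, SGD update), here is a plain-Python logistic regression trained on the same toy data; this illustrates the general recipe, not atomgrad's actual internals:

```python
import math
import random

random.seed(0)

X = [[2.0, 3.0, -1.0], [3.0, -1.0, 0.5], [0.5, 1.0, 1.0],
     [1.0, 1.0, -1.0], [0.0, 4.0, 0.5], [3.0, -1.0, 0.5]]
Y = [1, 1, 0, 1, 0, 1]

# a single linear "layer" with a sigmoid output
w = [random.uniform(-0.1, 0.1) for _ in range(3)]
b = 0.0
lr = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

for epoch in range(200):
    for x, y in zip(X, Y):
        # forward pass: predicted probability
        p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
        # backward pass: for sigmoid + BCE, dL/dz simplifies to (p - y)
        g = p - y
        # SGD update
        w = [wi - lr * g * xi for wi, xi in zip(w, x)]
        b -= lr * g

preds = [sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) for x in X]
acc = sum((p > 0.5) == bool(y) for p, y in zip(preds, Y)) / len(Y)
print(acc)
```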
Demos
An example of simple autodiff, plus two binary classifiers trained with a 16-node hidden layer, can be found in the demos.ipynb notebook.
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file atomgrad-0.2.8.tar.gz.
File metadata
- Download URL: atomgrad-0.2.8.tar.gz
- Upload date:
- Size: 6.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.12.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | `d81238c905b2215ab102f80133727548bc05712d7029f56e699ea34644e961ef`
MD5 | `1ac7c3e049352f32221f2948b73e5a42`
BLAKE2b-256 | `8d273b042db1500dff046b79534cb74b93c03c1670048e56878e1198e46601c0`
File details
Details for the file atomgrad-0.2.8-py3-none-any.whl.
File metadata
- Download URL: atomgrad-0.2.8-py3-none-any.whl
- Upload date:
- Size: 6.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.12.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | `571478d2a14f3c6fdaec0b6999a8961a7d277b257a7e3c8b5ba9a9f01cd5d715`
MD5 | `22f823316fb36e85ec6080ed0bfd6731`
BLAKE2b-256 | `701b8364ec631dbbfd358fb73eb205d5a97bae4b472e85dc39321e3be9cfd8d6`