quantagrad
An autograd engine built for fun. Implements backpropagation and a small neural-network library on top of it with a PyTorch-like API. Potentially useful for educational purposes.
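quantagrad's internals aren't shown on this page. As background on what a scalar autograd engine with backpropagation does, here is a minimal, self-contained sketch in the micrograd style; it is illustrative only and is not quantagrad's actual implementation:

```python
class Value:
    """A scalar that records the operations producing it, so gradients
    can be computed by reverse-mode automatic differentiation."""

    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = lambda: None  # set by the op that created this node

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a, b = Value(2.0), Value(3.0)
c = a * b + a   # dc/da = b + 1 = 4, dc/db = a = 2
c.backward()
print(a.grad, b.grad)  # 4.0 2.0
```

The key idea, which any PyTorch-like engine shares, is that each operation stores a closure computing its local derivative, and `backward()` chains them in reverse topological order.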
Installation
pip install quantagrad
Example usage 1
Below is an example showing how it can be used:
import numpy as np
from quantagrad.engine import Nodes

node1 = Nodes(np.array([1.0]))
node2 = Nodes(np.array([[2], [3]]))
k = node1 + node2
k.backward()  # backpropagate from k through the graph
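In the example above the two operands have shapes `(1,)` and `(2, 1)`, so the addition relies on NumPy-style broadcasting. The snippet below shows the same shape arithmetic in plain NumPy, independent of quantagrad:

```python
import numpy as np

a = np.array([1.0])       # shape (1,)
b = np.array([[2], [3]])  # shape (2, 1)
k = a + b                 # (1,) broadcasts against (2, 1) -> (2, 1)
print(k.shape)            # (2, 1)
print(k)                  # [[3.] [4.]]
```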
Example usage 2
from quantagrad.neural_net import Layer, Sequential
layer1 = Layer(3, 2)
# printing out the structure of layer1
print(f"----Structure of Layer1----\n{layer1}\n")
# printing out the weights of layer1
print(f"----Weights of layer1----\n{layer1.w}\n")
layer2 = Layer(2, 1)
z = Sequential([layer1, layer2,])
print(f"----Structure of Sequential----\n{z}")
Training a neural net
"""How to set up a model for training"""
from quantagrad.module import module
from quantagrad.neural_net import Layer
from quantagrad.activations import ReLU
from quantagrad.loss_functions import CrossEntropyLoss
from quantagrad.optimizers import SGD
class digitNetwork(module):
def __init__(self):
self.fc1 = Layer(2, 60)
self.fc2 = Layer(60, 2)
self.relu = ReLU()
def forward(self, x):
x = self.fc1(x)
x = self.relu(x)
x = self.fc2(x)
return x
model = digitNetwork()
criterion = CrossEntropyLoss()
optim = SGD(model.parameters(), lr=0.01, alpha=0)
print(model)
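The snippet above builds the model, loss, and optimizer but stops before the actual training loop. A PyTorch-style loop repeats four steps: forward pass, loss computation, backward pass, and an optimizer step. The pure-Python sketch below shows that same pattern on a one-parameter least-squares problem, with the gradient written out by hand rather than via quantagrad (whose exact loop API is not shown on this page):

```python
# Toy problem: fit w so that w * x ~ y, with squared-error loss.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]  # the true w is 2.0

w = 0.0
lr = 0.05
for epoch in range(100):
    loss, grad = 0.0, 0.0
    for x, y in zip(xs, ys):
        pred = w * x                 # forward pass
        loss += (pred - y) ** 2      # accumulate squared error
        grad += 2 * (pred - y) * x   # d(loss)/dw by the chain rule
    w -= lr * grad                   # SGD step: move against the gradient
print(round(w, 3))  # converges toward 2.0
```

In an autograd setting, the hand-written `grad` line is replaced by a call to `backward()` on the loss, and the parameter update is delegated to the optimizer.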
The notebook demo.ipynb
provides a full demo of training an MLP classifier with cross-entropy loss and stochastic gradient descent.
License
MIT
Project details
Download files
Download the file for your platform.
Source Distribution
quantagrad-0.1.0.tar.gz (9.3 kB)
Built Distribution
quantagrad-0.1.0-py3-none-any.whl (11.0 kB)
File details
Details for the file quantagrad-0.1.0.tar.gz.
File metadata
- Download URL: quantagrad-0.1.0.tar.gz
- Upload date:
- Size: 9.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.1
File hashes
Algorithm | Hash digest
---|---
SHA256 | e5078a501f6bfeb3ba0aaaba6868dbd0c4e23eea16476c1bcd0578d258b8e635
MD5 | 8bdc583a00b7f9f1db454cc082b3bb70
BLAKE2b-256 | 78e9f6118e5894118446cc42ca563925b690ad5edf24a7f829fd341544477819
File details
Details for the file quantagrad-0.1.0-py3-none-any.whl.
File metadata
- Download URL: quantagrad-0.1.0-py3-none-any.whl
- Upload date:
- Size: 11.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.1
File hashes
Algorithm | Hash digest
---|---
SHA256 | d40da204377e45936de749c2313c7502d1eba7e6473fa7cae882315d38d350dc
MD5 | 3157e7290ecb310c482c38277d2ac258
BLAKE2b-256 | b7e11505d32e204f85cb7382eec8a1fba9b11d025879615231b2c8a99f39a723