A tiny neural network library for training classification neural networks
Project description
# microtinygrad
A micro-tiny neural network library that trains simple neural networks with mini-batch gradient descent, using only NumPy and pandas. It currently supports only pandas DataFrames as input. Useful for training small neural networks and quickly tuning hyperparameters.
Motivation
In order to learn what goes on under the hood of a neural network's backpropagation, I decided to implement it myself. However, my algorithm takes a more analytical approach (computing closed-form gradients) rather than the approach used by other neural network libraries (such as micrograd's Value tree of autograd operations). As a result, it currently supports only the common neural network patterns of linear activation/regression and softmax cross-entropy.
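For context, the closed-form result for the softmax cross-entropy case is the standard one: with logits $z$, predicted probabilities $p = \mathrm{softmax}(z)$, and a one-hot target $y$, the gradient of the loss with respect to the logits collapses to a simple difference, which is why no autograd tree is needed:

```latex
% Softmax probabilities, cross-entropy loss, and its closed-form gradient
p_i = \frac{e^{z_i}}{\sum_j e^{z_j}}, \qquad
L = -\sum_i y_i \log p_i, \qquad
\frac{\partial L}{\partial z_i} = p_i - y_i
```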
Simple Quick Start

```python
nn = NeuralNetwork()
```
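To make the idea concrete, here is a minimal, self-contained sketch of what the library does under the hood: one linear layer trained by mini-batch gradient descent using the closed-form softmax cross-entropy gradient. This is an illustration only, not microtinygrad's actual API; all names and hyperparameters here are assumptions.

```python
import numpy as np

# Illustration of the technique (NOT microtinygrad's API): train one linear
# layer with the closed-form softmax cross-entropy gradient dL/dz = p - y.

def softmax(z):
    # Subtract the row-wise max for numerical stability.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(p, y):
    # Mean negative log-likelihood of the true classes.
    return -np.mean(np.log(p[np.arange(len(y)), y] + 1e-12))

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 4))                 # one mini-batch: 32 samples, 4 features
W_true = rng.normal(size=(4, 3))             # synthetic "true" weights for labels
y = (X @ W_true).argmax(axis=1)              # 3-class labels derived from X

W = np.zeros((4, 3))                         # model parameters start at zero
b = np.zeros(3)

for _ in range(100):                         # plain gradient-descent steps
    p = softmax(X @ W + b)
    dz = (p - np.eye(3)[y]) / len(y)         # closed-form gradient, no autograd
    W -= 0.5 * (X.T @ dz)                    # update with learning rate 0.5
    b -= 0.5 * dz.sum(axis=0)

loss = cross_entropy(softmax(X @ W + b), y)  # training loss after 100 steps
```

With `W` initialized to zero the initial loss is `log(3)` (uniform predictions), so any successful training run ends below that value.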
Hashes for microtinygrad-0.0.1-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | bcd19908d6df2feab8366db68644f3b0c0a430b8acb4ef90d0d7a2bff12969a2
MD5 | 32c35839923256281d5ac1789e699b54
BLAKE2b-256 | 0fa42c4d51a8243e2cc40dc19c0e5255a72fea13ae5969e9063864f047ce479e