A tiny neural network library for training classification neural networks

Project description

"# microtinygrad"

microtinygrad is a micro-tiny neural network library that trains simple neural networks with mini-batch gradient descent, using only numpy and pandas. It currently supports pandas DataFrames only as input. It is useful for training small neural networks and quickly tuning hyperparameters.

Motivation

In order to learn what goes on under the hood of neural network backpropagation, I decided to implement it myself. However, my algorithm takes a more analytical approach (computing closed-form gradients) rather than the approach used by other neural network libraries (such as micrograd's Value tree). Thus, it currently only supports common neural network patterns: linear activation (regression) and softmax cross-entropy.
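Concretely, "closed-form" means the gradient of each supported loss/activation pairing is hand-derived rather than traced through an autograd graph. For softmax cross-entropy, the well-known result is that the gradient of the loss with respect to the logits is simply the predicted probabilities minus the one-hot targets. A minimal numpy sketch of that identity (an illustration, not code from this library):

```python
import numpy as np

def softmax(z):
    # Subtract the row max for numerical stability before exponentiating
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

z = np.array([[2.0, 1.0, 0.1]])  # logits for one sample, three classes
y = np.array([[1.0, 0.0, 0.0]])  # one-hot target

# Closed-form gradient of cross-entropy loss w.r.t. the logits:
# dL/dz = softmax(z) - y, no computation graph required.
grad = softmax(z) - y
```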

Simple Quick Start

from microtinygrad import NeuralNetwork  # assumed import path

nn = NeuralNetwork()
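The constructor call above is the only usage shown in the description. For a sense of what the library automates, here is a self-contained numpy/pandas loop implementing the same ingredients it names (mini-batch gradient descent with the closed-form softmax cross-entropy gradient); this is a sketch of the technique, not microtinygrad's API:

```python
import numpy as np
import pandas as pd

# Toy dataset as a pandas DataFrame: two features, binary label.
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(200, 2)), columns=["x1", "x2"])
df["label"] = (df["x1"] + df["x2"] > 0).astype(int)

n_classes, lr, batch_size = 2, 0.1, 32
W = np.zeros((2, n_classes))  # weights of a single linear layer
b = np.zeros(n_classes)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

for epoch in range(20):
    shuffled = df.sample(frac=1.0, random_state=epoch)  # reshuffle each epoch
    for start in range(0, len(shuffled), batch_size):
        batch = shuffled.iloc[start:start + batch_size]
        X = batch[["x1", "x2"]].to_numpy()
        y = np.eye(n_classes)[batch["label"].to_numpy()]  # one-hot targets
        probs = softmax(X @ W + b)
        # Closed-form gradient of the mean cross-entropy: dL/dz = probs - y
        dz = (probs - y) / len(batch)
        W -= lr * (X.T @ dz)
        b -= lr * dz.sum(axis=0)
```

The labels are linearly separable by construction, so this single linear layer with softmax cross-entropy is enough to fit them; a deeper network would add hidden layers and the corresponding hand-derived gradients.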



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

microtinygrad-0.0.1.tar.gz (4.5 kB)

Uploaded: Source

Built Distribution

microtinygrad-0.0.1-py3-none-any.whl (4.9 kB)

Uploaded: Python 3
