A sleek auto-differentiation library that wraps numpy.
Project description
mygrad is a simple, NumPy-centric autograd library. An autograd library enables you to automatically compute derivatives of mathematical functions. This library is designed to serve primarily as an educational tool for learning about gradient-based machine learning; it is easy to install, has a readable and easily customizable code base, and provides a sleek interface that mimics NumPy. Furthermore, it leverages NumPy's vectorization to achieve good performance despite the library's simplicity.
This is not meant to be a competitor to libraries like PyTorch (which mygrad most closely resembles) or TensorFlow. Rather, it is meant to serve as a useful tool for students who are learning about training neural networks using backpropagation.
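The core mechanism behind such a library can be sketched in a few lines of NumPy. The `Tensor` class below is a hypothetical illustration, not mygrad's actual implementation: each arithmetic operation records a small function that knows how to pass an incoming gradient back to its inputs, so calling `backward()` on a result propagates derivatives through the whole computation.

```python
import numpy as np

# A minimal sketch of the idea behind an autograd library: a hypothetical
# Tensor class (NOT mygrad's actual code) that records how each value was
# computed so gradients can be propagated backward through the graph.
class Tensor:
    def __init__(self, data):
        self.data = np.asarray(data, dtype=float)
        self.grad = None
        self._backward_fn = None  # sends an incoming gradient to the parents

    def _accumulate(self, grad):
        # Sum gradient contributions, then push them further back.
        self.grad = grad if self.grad is None else self.grad + grad
        if self._backward_fn is not None:
            self._backward_fn(grad)

    def __add__(self, other):
        out = Tensor(self.data + other.data)
        def backward_fn(grad):          # d(x + y)/dx = 1, d(x + y)/dy = 1
            self._accumulate(grad)
            other._accumulate(grad)
        out._backward_fn = backward_fn
        return out

    def __mul__(self, other):
        out = Tensor(self.data * other.data)
        def backward_fn(grad):          # d(x * y)/dx = y, d(x * y)/dy = x
            self._accumulate(grad * other.data)
            other._accumulate(grad * self.data)
        out._backward_fn = backward_fn
        return out

    def backward(self):
        # Seed the output gradient with ones and propagate.
        self._accumulate(np.ones_like(self.data))


x = Tensor(3.0)
y = Tensor(4.0)
f = x * y + x          # f = xy + x, so df/dx = y + 1 and df/dy = x
f.backward()
print(float(x.grad))   # 5.0
print(float(y.grad))   # 3.0
```

Because the operations are defined elementwise on NumPy arrays, the same sketch works unchanged for vector inputs, which is the vectorization advantage the description refers to.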
Hashes for mygrad-2.0.0.dev4-py3-none-any.whl
| Algorithm | Hash digest |
|---|---|
| SHA256 | f86fb66683251e9366b6fbd868702adf678482bce5f63d7cc6b0aed9b830b430 |
| MD5 | bcf48a4d528b088df42b2c45f1b5e75c |
| BLAKE2b-256 | 4e44a124d5f881c9a88b5a7ef549f2429025189e1833bbfa783c64236495de51 |