A sleek auto-differentiation library that wraps NumPy.
mygrad is a simple, NumPy-centric autograd library. An autograd library enables you to automatically compute derivatives of mathematical functions. This library is designed primarily as an educational tool for learning about gradient-based machine learning: it is easy to install, has a readable and easily customizable code base, and provides a sleek interface that mimics NumPy. Furthermore, it leverages NumPy’s vectorization to achieve good performance despite the library’s simplicity.
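To illustrate, here is a minimal sketch of computing a gradient with mygrad, assuming the `Tensor`/`backward`/`grad` interface of the 1.x releases:

```python
import mygrad as mg

# A tensor behaves like a NumPy array, but records the operations
# applied to it so that derivatives can be computed automatically.
x = mg.Tensor([1.0, 2.0, 3.0])

# f(x) = x1**2 + x2**2 + x3**2
f = mg.sum(x ** 2)

# Back-propagation: computes df/dx for every tensor involved in f
f.backward()

print(x.grad)  # array([2., 4., 6.]) -- i.e. df/dx = 2x
```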
This is not meant to be a competitor to libraries like PyTorch (which mygrad most closely resembles) or TensorFlow. Rather, it is meant to serve as a useful tool for students who are learning about training neural networks via backpropagation, as in the sketch below.
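For instance, the gradients that mygrad computes can drive a plain gradient-descent update. The model, data, and learning rate here are hypothetical, chosen purely for illustration; the sketch also assumes tensors expose `.data` and `.grad` arrays:

```python
import numpy as np
import mygrad as mg

w = mg.Tensor(np.array([0.0, 0.0, 0.0]))  # model weights (illustrative)
x = np.array([0.5, -1.0, 2.0])            # one input sample (illustrative)
y_true = 1.0                              # its target value

# Squared-error loss of a linear model: (w . x - y)**2
loss = (mg.sum(w * x) - y_true) ** 2

# Backpropagation populates w.grad with d(loss)/dw
loss.backward()

# One gradient-descent step on the underlying NumPy array
lr = 0.1
w = mg.Tensor(w.data - lr * w.grad)
```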
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
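(Since the wheel listed below is a universal `py3-none-any` wheel, installing with `pip install mygrad` should also work on any platform.)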
| Filename | Size | File type | Python version |
|----------|------|-----------|----------------|
| mygrad-1.2.0-py3-none-any.whl | 80.3 kB | Wheel | py3 |
| mygrad-1.2.0.tar.gz | 80.5 kB | Source | None |