
A sleek auto-differentiation library that wraps NumPy.

Project description

mygrad is a simple, NumPy-centric autograd library. An autograd library lets you automatically compute derivatives of mathematical functions. This library is designed to serve primarily as an educational tool for learning about gradient-based machine learning: it is easy to install, has a readable and easily customizable code base, and provides a sleek interface that mimics NumPy. Furthermore, it leverages NumPy's vectorization to achieve good performance despite the library's simplicity.

This is not meant to be a competitor to libraries like PyTorch (which mygrad most closely resembles) or TensorFlow. Rather, it is meant to serve as a useful tool for students who are learning about training neural networks via backpropagation.
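To make the idea of "automatically computing derivatives" concrete, here is a tiny, self-contained sketch of a reverse-mode autograd tensor written in pure NumPy. This is illustrative only; it is not mygrad's actual implementation, and the class and method names are invented for the example (real libraries use a proper topological ordering of the computation graph rather than the naive traversal shown here).

```python
import numpy as np

class Tensor:
    """Minimal reverse-mode autograd variable (illustrative sketch, not mygrad's code)."""
    def __init__(self, data, parents=(), backward_fns=()):
        self.data = np.asarray(data, dtype=float)
        self.grad = None
        self._parents = parents
        # one function per parent: maps incoming gradient -> gradient w.r.t. that parent
        self._backward_fns = backward_fns

    def __mul__(self, other):
        # product rule: d(a*b)/da = b, d(a*b)/db = a
        return Tensor(self.data * other.data,
                      parents=(self, other),
                      backward_fns=(lambda g: g * other.data,
                                    lambda g: g * self.data))

    def sum(self):
        # d(sum(x))/dx = 1 for every element
        return Tensor(self.data.sum(),
                      parents=(self,),
                      backward_fns=(lambda g: g * np.ones_like(self.data),))

    def backward(self):
        # seed the output gradient with 1, then push gradients to parents
        self.grad = np.ones_like(self.data)
        stack = [self]
        while stack:
            t = stack.pop()
            for parent, fn in zip(t._parents, t._backward_fns):
                contrib = fn(t.grad)
                parent.grad = contrib if parent.grad is None else parent.grad + contrib
                stack.append(parent)

x = Tensor([1.0, 2.0, 3.0])
y = (x * x).sum()   # y = sum(x**2) = 14
y.backward()
print(x.grad)       # d/dx sum(x**2) = 2x -> [2. 4. 6.]
```

In mygrad, the same computation looks nearly identical to plain NumPy code: you build expressions out of mygrad tensors, call `.backward()` on the result, and read gradients from each tensor's `.grad` attribute.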


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
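Rather than downloading a file manually, the package can typically be installed from PyPI with pip, which selects the appropriate wheel or source distribution automatically:

```shell
pip install mygrad
```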

Files for mygrad, version 1.9.0
  mygrad-1.9.0-py3-none-any.whl  (111.6 kB)  Wheel   Python version: py3
  mygrad-1.9.0.tar.gz            (99.1 kB)   Source  Python version: None
