
A sleek auto-differentiation library that wraps NumPy.

Project description

mygrad is a simple, NumPy-centric autograd library. An autograd library enables you to automatically compute derivatives of mathematical functions. This library is designed to serve primarily as an educational tool for learning about gradient-based machine learning; it is easy to install, has a readable and easily customizable code base, and provides a sleek interface that mimics NumPy. Furthermore, it leverages NumPy's vectorization to achieve good performance despite the library's simplicity.

This is not meant to be a competitor to libraries like PyTorch (which mygrad most closely resembles) or TensorFlow. Rather, it is meant to serve as a useful tool for students who are learning about training neural networks using backpropagation.
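To make the idea of automatic differentiation concrete, here is a minimal sketch of reverse-mode autodiff built on NumPy. Note that the `Var` class and its methods below are illustrative inventions for this sketch, not mygrad's actual API: each operation records how to propagate gradients back to its inputs, so calling `backward()` on a result applies the chain rule automatically.

```python
import numpy as np

class Var:
    """A toy differentiable value (illustrative only, not mygrad's API)."""
    def __init__(self, value):
        self.value = np.asarray(value, dtype=float)
        self.grad = np.zeros_like(self.value)
        self._backward = lambda: None  # propagates gradients to parents

    def __add__(self, other):
        out = Var(self.value + other.value)
        def _backward():
            # d(a + b)/da = 1 and d(a + b)/db = 1; pass out.grad through
            self.grad += out.grad
            other.grad += out.grad
            self._backward()
            other._backward()
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Var(self.value * other.value)
        def _backward():
            # Chain rule: d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.value * out.grad
            other.grad += self.value * out.grad
            self._backward()
            other._backward()
        out._backward = _backward
        return out

    def backward(self):
        # Seed the output gradient with ones, then propagate backward
        self.grad = np.ones_like(self.value)
        self._backward()

# f(x) = x*x + x, so df/dx = 2x + 1; at x = 3 the derivative is 7
x = Var(3.0)
f = x * x + x
f.backward()
print(x.grad)  # -> 7.0
```

Because the values are NumPy arrays, the same mechanism works elementwise on whole arrays at once, which is how vectorization keeps a simple design fast.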

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Files for mygrad, version 1.2.0

Filename                        Size     File type  Python version
mygrad-1.2.0-py3-none-any.whl   80.3 kB  Wheel      py3
mygrad-1.2.0.tar.gz             80.5 kB  Source     None
