
An automatic differentiation package

Project description


autodiff

Group Name: DFYS

Group Number: 12

Group Member: Feiyu Chen, Yueting Luo, Yan Zhao



Automatic differentiation (AD) is a family of techniques for efficiently and accurately evaluating derivatives of numeric functions expressed as computer programs. Applications of AD include Newton's method for solving nonlinear equations, real-parameter optimization, probabilistic inference, and backpropagation in neural networks. AD has become increasingly popular with the rapid development of machine learning and deep learning. Our AD software package enables users to calculate derivatives using forward and reverse mode. The package includes features such as root finding, optimization (Newton's method, gradient descent, BFGS), and backpropagation.
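To illustrate how forward mode works and how it supports root finding, here is a minimal sketch using dual numbers with Newton's method on top. This is a generic illustration of the technique, not this package's API; the `Dual`, `derivative`, and `newton` names are hypothetical.

```python
# Minimal forward-mode AD via dual numbers (illustration only;
# not the DFYS-autodiff API).
class Dual:
    """A number val + der*eps with eps**2 == 0; der carries the derivative."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def _coerce(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._coerce(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __sub__(self, other):
        other = self._coerce(other)
        return Dual(self.val - other.val, self.der - other.der)

    def __mul__(self, other):
        other = self._coerce(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f'(x) by seeding the dual part with 1."""
    return f(Dual(x, 1.0)).der

def newton(f, x0, tol=1e-10, max_iter=50):
    """Find a root of f with Newton's method, using forward-mode AD for f'."""
    x = x0
    for _ in range(max_iter):
        fx = f(Dual(x, 1.0))          # value and derivative in one pass
        x_new = x - fx.val / fx.der   # Newton update
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# f(x) = x**2 - 2 has a root at sqrt(2)
root = newton(lambda x: x * x - 2.0, 1.0)
```

Forward mode evaluates the function and its derivative in a single pass, which is what makes the Newton update above cheap: each iteration is one call to `f` on a dual number.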

Installing autodiff

Here is how to install autodiff from the command line. We assume that the user has already installed pip and virtualenv:

Method 1 (using pip):

pip install DFYS-autodiff

Method 2 (directly from github repo):

  1. Clone the project repo with git clone.
  2. cd into the local repo and create a virtual environment with virtualenv env.
  3. Activate the virtual environment with source env/bin/activate (use deactivate to leave it later).
  4. Install the dependencies with pip install -r requirements.txt.
  5. Install autodiff with pip install -e .
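The steps above can be sketched as a single shell session (the repository URL is a placeholder; substitute the project's actual GitHub address):

```shell
# clone and enter the repo (URL is a placeholder)
git clone https://github.com/<group>/autodiff.git
cd autodiff

# create and activate an isolated environment
virtualenv env
source env/bin/activate

# install dependencies, then autodiff itself in editable mode
pip install -r requirements.txt
pip install -e .

# later: leave the virtual environment
deactivate
```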

Getting Started

Here is the documentation page

See milestone2.ipynb under docs/.

Project details

Download files


Source Distribution

DFYS_autodiff-0.0.3.tar.gz (10.1 kB)

