An automatic differentiation package
cs207-FinalProject: autodiff
Group Name: DFYS
Group Number: 12
Group Members: Feiyu Chen, Yueting Luo, Yan Zhao
Introduction
Automatic differentiation (AD) is a family of techniques for efficiently and accurately evaluating derivatives of numeric functions expressed as computer programs. Applications of AD include Newton's method for solving nonlinear equations, real-parameter optimization, probabilistic inference, and backpropagation in neural networks. AD has become extremely popular with the booming development of machine learning and deep learning techniques. Our AD software package enables users to calculate derivatives using both the forward and the reverse mode. On top of AD, the package provides features including root finding, optimization (Newton's method, gradient descent, BFGS), and backpropagation.
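To illustrate the idea behind forward-mode AD, here is a minimal, self-contained dual-number sketch. This is not the autodiff package's API (the `Dual` class and its methods are hypothetical names for this example only); it only demonstrates the technique the package implements:

```python
# Minimal sketch of forward-mode AD via dual numbers.
# NOT the autodiff package's API -- an illustrative example only.

class Dual:
    """A number carrying both a value and a derivative (seed)."""
    def __init__(self, val, der=0.0):
        self.val = val
        self.der = der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u * v)' = u' * v + u * v'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

    __rmul__ = __mul__

# Differentiate f(x) = x * x + 3 * x at x = 2; f'(x) = 2x + 3, so f'(2) = 7.
x = Dual(2.0, 1.0)   # seed dx/dx = 1
f = x * x + 3 * x
print(f.val, f.der)  # 10.0 7.0
```

The forward mode propagates derivatives alongside values through each elementary operation; the reverse mode instead records the computation and propagates sensitivities backward, which is the basis of backpropagation.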
Installing autodiff
Here is how to install autodiff on the command line. We assume that the user has already installed pip and virtualenv:
Method 1 (using pip):
pip install DFYS-autodiff
Method 2 (directly from the GitHub repo):
- Clone the project repo by git clone git@github.com:D-F-Y-S/cs207-FinalProject.git
- cd into the local repo and create a virtual environment by virtualenv env
- Activate the virtual environment by source env/bin/activate (use deactivate to deactivate the virtual environment later.)
- Install the dependencies by pip install -r requirements.txt
- Install autodiff by pip install -e .
Getting Started
For the full documentation, see milestone2.ipynb under docs/.
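As a taste of what root finding with Newton's method (one of the package's advertised features) looks like, here is a standalone sketch. The derivative is supplied by hand here; in the package it would be obtained via AD. The `newton` function is a hypothetical name for this example, not the package's actual API:

```python
# Illustrative Newton's-method root finder -- not the package's API.
# The derivative df is passed explicitly; an AD package would compute it.

def newton(f, df, x0, tol=1e-10, max_iter=50):
    """Find a root of f starting from x0 using Newton's method."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        # Newton update: x_{n+1} = x_n - f(x_n) / f'(x_n)
        x = x - fx / df(x)
    return x

# Solve x^2 - 2 = 0; the positive root is sqrt(2) ~ 1.41421356
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
print(root)
```

The same Newton update generalizes to systems of equations, where the scalar derivative is replaced by the Jacobian, which is exactly the kind of quantity AD computes efficiently.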