
Automatic differentiation tool developed by ucxbw

Project description

Group #36 cs107-FinalProject

Authors:

  • Lanting Li
  • Jenny Dong
  • Jiaye Chen

Introduction

Automatic Differentiation (AD) is a powerful tool in optimization problems, such as root finding with Newton's method, and it has been applied widely in science and engineering. We implement a Python package for it called adxbw.

In an optimization problem, the core task is to find the conditions under which we reach local or global maxima or minima, as well as the zero points of a function. Compared to linear functions, finding the roots of non-linear functions is computationally much harder. Numeric and symbolic methods (and, of course, working and coding by hand) each fail on their own at such high complexity. AD, which integrates the advantages of numeric and symbolic differentiation, can be used to solve this problem.
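
To give a flavor of how the forward mode works, here is a minimal dual-number sketch (an illustration of the general idea only, not the adxbw implementation): every intermediate value carries its derivative along, so the chain rule is applied numerically, one elementary operation at a time, with the exactness of symbolic differentiation.

# Minimal forward-mode sketch with dual numbers (illustration, not adxbw code)
class Dual:
    def __init__(self, val, der):
        self.val = val  # value of the expression
        self.der = der  # derivative carried alongside the value

    def __add__(self, other):
        return Dual(self.val + other.val, self.der + other.der)

    def __mul__(self, other):
        # product rule applied at this single operation
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

# f(x) = x*x + x, so f'(x) = 2x + 1 and f'(3) = 7
x = Dual(3.0, 1.0)  # seed derivative 1 for the variable we differentiate with respect to
f = x * x + x
print(f.val, f.der)  # 12.0 7.0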

How to use

To install:

pip install adxbw

Import

import adxbw

Univariate example:

# forward-mode AD is implemented in node.py
import math
from adxbw.node import AD  # assumption: AD is exposed via adxbw's node module
x = AD(math.e, 1)
f = x.log()  # default base is e
print(f)

Output:

The value and derivative of current function are 1.0 and 0.36787944117144233
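
This matches the analytical result: for $f(x) = \ln(x)$ we have $f'(x) = 1/x$, so at $x = e$ the value is $\ln(e) = 1$ and the derivative is $1/e \approx 0.3679$.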

As an additional feature, we have used forward-mode AD to compute the Jacobian in Newton's root-finding method. We wrote a wrapper for Newton's method that works for single or multiple scalar inputs and for a single vector input.
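
Concretely, forward-mode AD supplies the derivative $f'(x_k)$ (or the Jacobian in the multivariate case) used in the standard Newton update $x_{k+1} = x_k - f(x_k)/f'(x_k)$, so the user never has to differentiate the function by hand.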

For Newton's optimization with a single scalar or vector input, we require the user to pass an AD object whose value (the first parameter used to initialize the AD object) is a numeric scalar or a numpy array, and whose partial derivative (the second parameter used to initialize the AD object) is a numeric scalar (it CANNOT be an array or list). The user should also pass a string representing the function to optimize, with the variable named x_k (see the funct string in the code below). Users should choose the initial guess (the value of the input AD object) carefully, because it plays an important role in the convergence of Newton's method. The newton function also accepts other optional parameters, such as the learning rate and the maximum number of iterations; please check the code documentation for details.

Newton's optimization with single scalar:

from adxbw.optimization import newton

x = AD(100, 1)  # 100 is the initial guess
funct = "- x_k**2 - 2*x_k"
temp_root = newton(funct, x)

Newton's optimization with single vector:

import numpy as np  # explicit import, in case np is not re-exported by adxbw
x = AD(np.array([-2, -5, -8]), 1)
funct = "- x_k**2 - 2*x_k"
temp_root = newton(funct, x)

See the Extension section below for further details on Newton's method.

Multivariate example:

Suppose you have a function $f(x, y) = x + y^2$ that you need to evaluate at $(1, 2)$.

# numpy is imported as np in package, no need to re-import
# build input node: multivariate case x = 1, y = 2;
x = AD(1, np.array([1, 0]))
y = AD(2, np.array([0, 1]))

f = x + y ** 2
print(f.val)
print(f.partial_ders)
print(f)

Output:

5
[1 4]
The value and derivative of current function are 5 and [1 4]
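
As a sanity check, $\partial f/\partial x = 1$ and $\partial f/\partial y = 2y$, so at $(x, y) = (1, 2)$ the value is $1 + 2^2 = 5$ and the gradient is $[1, 4]$, matching the output above.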

Newton's optimization for multiple scalar variables:

from adxbw.optimization import newton_multi

x = AD(-1, np.array([1, 0]))
y = AD(2, np.array([0, 1]))
# All values of the initial nodes should be scalar (single float/int)
# Length of partial derivatives numpy array should correspond to number of variables
temp_dict = {"x1": x, "x2": y}
# Initialize a dictionary with keys to be customized names of variables

# Use keys of the dictionary to construct the representative function string
funct = "x1.sin() - x2.cos()"
# First two parameters should be representative function string 
# and dictionary of the initial nodes
temp_root = newton_multi(funct, temp_dict, lr=0.05)

As with the single-input newton, users can set a different learning rate and maximum number of iterations. See the documentation in the package for details (help(adxbw.optimization.newton_multi)).
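
For reference, the gradient that the forward mode computes for this example function $f(x_1, x_2) = \sin(x_1) - \cos(x_2)$ is $[\cos(x_1), \sin(x_2)]$, which serves as the Jacobian in each Newton step.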

Broader Impact and Inclusivity Statement

Automatic differentiation and its application to root finding are cornerstones of optimization. This means that our package could be used beyond the mathematical world and, more broadly, in science and engineering.

For example, as computational biologists we model cellular and molecular processes, from cell signaling to gene networks, using nonlinear equations, and root finding with AD enables machine-level accuracy. This implementation of automatic differentiation provides a convenient way for users to calculate and evaluate the values and derivatives of functions. People who want to use partial derivatives for higher-level computation no longer have to work them out by hand, which removes a lot of unnecessary workload for scientific researchers.

However, potential implementation errors in our package may cause negative impacts that users do not realize. Notably, there is no peer-review process when developers upload their packages to platforms such as conda and PyPI, so underlying bugs within a package are not easily found by users. Misuse of such packages can lead to erroneous results in research projects, and the conclusions drawn or implications made from those results could in turn have significant social impact.

For example, high school or college students who have just started learning calculus may misuse the package by relying heavily on automatic differentiation tools to compute derivatives. As a result, they may never learn the mathematical mechanism behind differentiation, so misuse of our tool could amount to an educational failure.

Download files

Download the file for your platform.

Source Distribution

adxbw-0.1.5.tar.gz (17.2 kB)

Uploaded Source

Built Distribution

adxbw-0.1.5-py3-none-any.whl (12.1 kB)

Uploaded Python 3

File details

Details for the file adxbw-0.1.5.tar.gz.

File metadata

  • Download URL: adxbw-0.1.5.tar.gz
  • Size: 17.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.7.1 importlib_metadata/4.8.2 pkginfo/1.8.2 requests/2.24.0 requests-toolbelt/0.9.1 tqdm/4.47.0 CPython/3.8.3

File hashes

Hashes for adxbw-0.1.5.tar.gz

  • SHA256: 685d030a3b9e3d0a42c23aee2192ccfc4dad4ce04d230d42da5fe0d0d9496e10
  • MD5: 86e6dc612f266cd86809b6052f10c2cc
  • BLAKE2b-256: 51bab8d60ef20a0b1755ca92e5447a528313f9857ea99ee57139b9304367cbf4


File details

Details for the file adxbw-0.1.5-py3-none-any.whl.

File metadata

  • Download URL: adxbw-0.1.5-py3-none-any.whl
  • Size: 12.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.7.1 importlib_metadata/4.8.2 pkginfo/1.8.2 requests/2.24.0 requests-toolbelt/0.9.1 tqdm/4.47.0 CPython/3.8.3

File hashes

Hashes for adxbw-0.1.5-py3-none-any.whl

  • SHA256: 8ec9b8fb847b5c449adca9c3a933f9d8a1fb893b5f462cb87c4df5f0c7eb0902
  • MD5: 62f0aae002ddbd20752270b553606b7e
  • BLAKE2b-256: c2adff90751662ccfc3deb0bcfab8859687766339ae9b0094f01518c4eedce85

