
Solves automatic numerical differentiation problems in one or more variables.

Project description


Suite of tools written in Python to solve automatic numerical differentiation problems in one or more variables. Finite differences are used in an adaptive manner, coupled with a Romberg extrapolation methodology, to provide a maximally accurate result. Many options are configurable, such as the order of the method or of the extrapolation, and whether central, forward or backward differences are used (a brief sketch follows the list below). The methods provided are:

Derivative: Compute the derivatives of order 1 through 4 of any scalar function.

Gradient: Compute the gradient vector of a scalar function of one or more variables.

Jacobian: Compute the Jacobian matrix of a vector valued function of one or more variables.

Hessian: Compute the Hessian matrix of all 2nd partial derivatives of a scalar function of one or more variables.

Hessdiag: Compute only the diagonal elements of the Hessian matrix.

All of these methods also produce error estimates on the result. The documentation for these tools is available at http://numdifftools.readthedocs.org.
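
For example, the finite-difference scheme and the approximation order can be chosen when the differentiator is constructed. A minimal sketch, assuming the keyword names method and order (the options themselves are described above, but the exact keyword names may vary between versions, so check the docstrings):

>>> import numpy as np
>>> import numdifftools as nd
>>> fd4 = nd.Derivative(np.exp, method='forward', order=4)  # keyword names assumed
>>> np.allclose(fd4(1), np.exp(1))
True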

To check that the toolbox is working, paste the following into an interactive Python session:

import numdifftools as nd
nd.test(coverage=True, doctests=True)

Examples

Compute the 1st and 2nd derivative of exp(x) at x == 1:

>>> import numpy as np
>>> import numdifftools as nd
>>> fd = nd.Derivative(np.exp)        # 1st derivative
>>> fdd = nd.Derivative(np.exp, n=2)  # 2nd derivative
>>> fd(1)
array([ 2.71828183])
>>> np.allclose(fdd(1), np.exp(1))    # d^2/dx^2 exp(x) == exp(x)
True
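
The error estimates mentioned above can be inspected once a derivative has been evaluated. A sketch assuming the estimate is exposed as an error_estimate attribute, as in the DERIVEST-style API this package descends from (verify the attribute name against the linked documentation):

>>> _ = fd(1)            # evaluating the derivative populates the estimate
>>> fd.error_estimate    # attribute name assumed; see the documentation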

Nonlinear least squares: at the true parameters c = (1, 2, 0.75) the residuals vanish, so the Jacobian of the squared residuals (2*r*dr/dc) should be numerically zero:

>>> xdata = np.reshape(np.arange(0,1,0.1),(-1,1))
>>> ydata = 1+2*np.exp(0.75*xdata)
>>> fun = lambda c: (c[0]+c[1]*np.exp(c[2]*xdata) - ydata)**2
>>> Jfun = nd.Jacobian(fun)
>>> np.abs(Jfun([1,2,0.75])) < 1e-14 # should be numerically zero
array([[ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True]], dtype=bool)

Compute the gradient of sum(x**2):

>>> fun = lambda x: np.sum(x**2)
>>> dfun = nd.Gradient(fun)
>>> dfun([1,2,3])
array([ 2.,  4.,  6.])
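
Hessian and Hessdiag follow the same calling pattern. Since the Hessian of sum(x**2) is exactly 2*I, the results are easy to check up to numerical tolerance (a sketch reusing fun from above):

>>> Hfun = nd.Hessian(fun)
>>> np.allclose(Hfun([1, 2, 3]), 2 * np.eye(3))
True
>>> np.allclose(nd.Hessdiag(fun)([1, 2, 3]), [2., 2., 2.])
True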

See also

scipy.misc.derivative

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
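
If you just want to use the package, installing from PyPI is usually simpler than downloading files by hand:

pip install numdifftools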

Source Distribution

numdifftools-0.7.7.zip (181.5 kB)

Built Distribution

numdifftools-0.7.7.win32.exe (223.1 kB)

File details

Details for the file numdifftools-0.7.7.zip.

File metadata

  • Download URL: numdifftools-0.7.7.zip
  • Size: 181.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No

File hashes

Hashes for numdifftools-0.7.7.zip

Algorithm    Hash digest
SHA256       c5a309a7a0b58f6177dc7e3d55c2830c07ca2715c9c706644d500fc4eca97c4d
MD5          07bce309b22fccee4af499c85518e90b
BLAKE2b-256  ea1e4006e7e8e6b5cae2d42657ab31de2474dca3daa6e06df17a36ae45932e52
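
To verify a downloaded archive against the SHA256 digest above, one option is Python's standard hashlib (a minimal sketch; adjust the filename to wherever you saved the archive):

import hashlib

expected = "c5a309a7a0b58f6177dc7e3d55c2830c07ca2715c9c706644d500fc4eca97c4d"

with open("numdifftools-0.7.7.zip", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()

assert actual == expected, "SHA256 mismatch: the download may be corrupt"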


File details

Details for the file numdifftools-0.7.7.win32.exe.

File hashes

Hashes for numdifftools-0.7.7.win32.exe

Algorithm    Hash digest
SHA256       2a9fa00e33a3d5ff8576f017c5e7abafdc51278ad88861e1b9640c7ccf5e1d99
MD5          7e2fcc08f64aecea39a7026ba045a6e2
BLAKE2b-256  991b7e02c71baeb64f3c09252ef1ee2d939d76f7dee310d01ee416d06db5b32a

