
Solves automatic numerical differentiation problems in one or more variables.

Project description

A suite of tools for solving automatic numerical differentiation problems in one or more variables. All of the methods also produce an error estimate of the result. A PDF file explaining the theory behind these tools is also provided.

To test if the toolbox is working, paste the following into an interactive Python session:

import numdifftools as nd
nd.test(coverage=True, doctests=True)

Derivative

A flexible tool for computing derivatives of order 1 through 4 of any scalar function. Finite differences are used in an adaptive manner, coupled with a Romberg extrapolation methodology, to provide a maximally accurate result. The user can configure many options, such as the order of the method or of the extrapolation, and can specify whether central, forward or backward differences are used.
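The idea behind this scheme can be illustrated with a minimal sketch (an illustration of the principle, not numdifftools' actual implementation): evaluate a central difference at two step sizes and combine them with one Richardson/Romberg extrapolation step, which cancels the leading error term.

```python
import numpy as np

def central_diff(f, x, h):
    # Second-order central difference: truncation error is O(h**2).
    return (f(x + h) - f(x - h)) / (2.0 * h)

def richardson_first_derivative(f, x, h=0.1):
    # Combine estimates at h and h/2 to cancel the leading O(h**2)
    # error term, leaving an O(h**4) result (one Romberg step).
    d1 = central_diff(f, x, h)
    d2 = central_diff(f, x, h / 2.0)
    extrapolated = (4.0 * d2 - d1) / 3.0
    # The spread between the two raw estimates gives a rough error estimate,
    # in the spirit of the error estimates the toolbox reports.
    error_estimate = abs(d2 - d1)
    return extrapolated, error_estimate

deriv, err = richardson_first_derivative(np.exp, 1.0)
print(deriv)  # close to e = 2.71828...
```

numdifftools automates the choice of step sizes and applies the extrapolation adaptively; this sketch fixes `h` by hand only to keep the example short.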

Gradient

Computes the gradient vector of a scalar function of one or more variables at any location.

Jacobian

Computes the Jacobian matrix of a vector (or array) valued function of one or more variables.

Hessian

Computes the Hessian matrix of all 2nd partial derivatives of a scalar function of one or more variables.

Hessdiag

Computes only the diagonal elements of the Hessian matrix, i.e., the pure second-order partial derivatives of a scalar function of one or more variables.

Examples

Compute the first and second derivative of exp(x) at x == 1:

>>> import numpy as np
>>> import numdifftools as nd
>>> fd = nd.Derivative(np.exp)        # first derivative
>>> fdd = nd.Derivative(np.exp, n=2)  # second derivative
>>> fd(1)
array([ 2.71828183])

Nonlinear least squares:

>>> xdata = np.reshape(np.arange(0,1,0.1),(-1,1))
>>> ydata = 1+2*np.exp(0.75*xdata)
>>> fun = lambda c: (c[0]+c[1]*np.exp(c[2]*xdata) - ydata)**2
>>> Jfun = nd.Jacobian(fun)
>>> np.abs(Jfun([1,2,0.75])) < 1e-14 # should be numerically zero
array([[ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True]], dtype=bool)

Compute gradient of sum(x**2):

>>> fun = lambda x: np.sum(x**2)
>>> dfun = nd.Gradient(fun)
>>> dfun([1,2,3])
array([ 2.,  4.,  6.])

See also

scipy.misc.derivative


Release history

0.9.20
0.9.17
0.9.16
0.9.15
0.9.14
0.9.13
0.9.12
0.9.11
0.9.10
0.9.2
0.7.7
0.7.3
0.6.0
0.5.0 (this version)
0.4.0
0.3.5
0.3.4
0.3.3
0.3.1
0.2.1
0.2

Download files

Download the file for your platform.

Filename (size)                          File type          Python version  Upload date
Numdifftools-0.5.0.win32.exe (231.9 kB)  Windows Installer  any             Jan 10, 2014
Numdifftools-0.5.0.zip (185.6 kB)        Source             None            Jan 10, 2014
