
Regularised least squares objective function solver


This package provides trust region algorithms (TRA) for finding the minimum of a function. At the moment it contains only Levenberg-Marquardt, but it will be expanded to include NL2SOL and Powell's dogleg method.

Levenberg-Marquardt

Example

An example is included within the package; simply run:

import numpy as np
import TRA
def forward_model(x):
    y = np.array(x[0] ** 2 + x[1] ** 2)
    y = y.reshape((1, 1))
    return y

def compute_gradient(x):
    g = np.array(([2 * x[0]], [2 * x[1]]))
    g = g.reshape((2, 1))
    return g

def compute_hessian(x):
    h = np.array(([2, 0], [0, 2]))
    h = h.reshape((2, 2))
    return h


initial_prediction = np.array([5, 2.7])

LM_algorithm = TRA.Levenberg_Marquart(initial_prediction, compute_hessian, compute_gradient,
                                      forward_model, d_param=1e-50,
                                      lower_constraint=-np.inf,
                                      upper_constraint=np.inf,
                                      num_iterations=5)

minimum = LM_algorithm.optimisation_main()




This is a simple example, but it shows how to use the Levenberg_Marquart class.

Function calls and arguments

There are a number of default values within the Levenberg_Marquart class, including constraints on the solution, the number of iterations and the damping parameter corresponding to the trust region. Three functions are required when instantiating a class object: one for computing the gradient, one for the Hessian and one for the mapping from input to output (the forward model).

def forward_model(x):
    ...
    return f(x)

def compute_gradient(x):
    ...
    return grad

def compute_hessian(x):
    ...
    return hessian

initial_prediction = x0

LM_object = TRA.Levenberg_Marquart(initial_prediction, compute_hessian, compute_gradient,
                                   forward_model, d_param=1e-50,
                                   lower_constraint=-np.inf,
                                   upper_constraint=np.inf,
                                   num_iterations=5)
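As a second, purely illustrative example (not shipped with the package), the three callables for the Rosenbrock function f(x) = (1 - x0)^2 + 100(x1 - x0^2)^2 might look as follows, keeping the same array shapes as the quadratic example above: forward model (1, 1), gradient (2, 1), Hessian (2, 2).

```python
import numpy as np

def forward_model(x):
    # Rosenbrock function, returned as a (1, 1) array.
    y = np.array((1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2)
    return y.reshape((1, 1))

def compute_gradient(x):
    # Analytic gradient of the Rosenbrock function, shape (2, 1).
    g = np.array([[-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2)],
                  [200 * (x[1] - x[0] ** 2)]])
    return g.reshape((2, 1))

def compute_hessian(x):
    # Analytic Hessian of the Rosenbrock function, shape (2, 2).
    h = np.array([[2 - 400 * (x[1] - 3 * x[0] ** 2), -400 * x[0]],
                  [-400 * x[0], 200]])
    return h.reshape((2, 2))
```

The minimum of the Rosenbrock function is at (1, 1), where the gradient vanishes.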



Theory

For the theory behind the code see [1] and [2].
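In short, both references build trust region methods around a damped Newton step: solve (H + lambda*I) p = -g, accept the step if it reduces the objective, and shrink or grow the damping parameter lambda accordingly. A minimal sketch of that idea (illustrative only, not the package's implementation):

```python
import numpy as np

def lm_minimise(f, grad, hess, x0, lam=1e-3, num_iterations=50):
    """Minimal Levenberg-Marquardt-style damped Newton iteration (illustrative)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(num_iterations):
        g = grad(x)
        H = hess(x)
        # Solve the damped system (H + lam*I) p = -g for the trial step p.
        p = np.linalg.solve(H + lam * np.eye(len(x)), -g)
        if f(x + p) < f(x):
            x = x + p      # step reduced the objective: accept it
            lam *= 0.5     # and trust the quadratic model more
        else:
            lam *= 2.0     # step failed: increase damping, shrink the step
    return x

# Quadratic bowl with its minimum at the origin.
f = lambda x: x[0] ** 2 + x[1] ** 2
grad = lambda x: np.array([2 * x[0], 2 * x[1]])
hess = lambda x: np.array([[2.0, 0.0], [0.0, 2.0]])

x_min = lm_minimise(f, grad, hess, [5.0, 2.7])
```

On this quadratic the iteration converges to the origin in a handful of steps, since the quadratic model is exact and every trial step is accepted.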

References

[1] Jorge Nocedal and Stephen J. Wright (2006). Numerical Optimization. Springer.

[2] Andrew R. Conn, Nicholas I. M. Gould and Philippe L. Toint (2000). Trust-Region Methods. SIAM.

