
Bayesian Hessian Approximation for Stochastic Optimization

Project description

The BayHess package uses noisy curvature pairs (noisy gradient differences computed at different points) to compute Hessian approximations, which can then be used to accelerate convergence in stochastic optimization in a quasi-Newton fashion. To find a Hessian approximation, a posterior distribution of the Hessian is built: the prior distribution is based on the Frobenius norm with determinant constraints that bound the extreme eigenvalues, and the likelihood is built from the secant equations given the observed curvature pairs. To find the maximizer of the log posterior, BayHess uses the Newton-CG method with a homotopy approach to handle the logarithmic-barrier formulation of the determinant constraints.
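
As a concrete illustration, the sketch below (plain NumPy, not the BayHess API) shows how noisy curvature pairs arise from a stochastic gradient and how a Hessian approximation preconditions a quasi-Newton step; the toy quadratic objective and noise level are assumptions made for this example.

```python
# Illustrative sketch in plain NumPy (not the BayHess API): how noisy
# curvature pairs arise in stochastic optimization, and how a Hessian
# approximation preconditions a quasi-Newton step. The quadratic
# objective and noise level are assumptions made for this example.
import numpy as np

rng = np.random.default_rng(0)
dim = 5
A = np.diag(np.linspace(0.1, 10.0, dim))  # true Hessian of f(x) = 0.5 x^T A x

def noisy_grad(x, sigma=0.1):
    """Stochastic gradient: exact gradient A @ x plus Gaussian noise."""
    return A @ x + sigma * rng.standard_normal(dim)

# A curvature pair (s_k, y_k): a step and the corresponding noisy
# gradient difference. The secant equation H s_k ≈ y_k ties each pair
# to the Hessian; BayHess treats many such noisy pairs as observations
# in a posterior distribution over H.
x_prev = rng.standard_normal(dim)
x_curr = x_prev - 0.05 * noisy_grad(x_prev)
s_k = x_curr - x_prev
y_k = noisy_grad(x_curr) - noisy_grad(x_prev)

# Given a Hessian approximation H_hat (here the exact Hessian, purely
# for illustration), a quasi-Newton step solves H_hat d = g:
H_hat = A
d = np.linalg.solve(H_hat, noisy_grad(x_curr))
x_next = x_curr - d
```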

For a detailed description of the method, its convergence analysis, and numerical results, see our manuscript “Approximating Hessian matrices using Bayesian inference: a new approach for quasi-Newton methods in stochastic optimization”. This package can be used together with the MICE gradient estimator.
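
To show how the pieces fit together, here is a hypothetical end-to-end sketch. The class and method names (`BayHess`, `update_curv_pairs`, `find_hess`) and the constructor arguments are assumptions inferred from the description above rather than a verified API; consult the package documentation for the actual interface.

```python
# Hypothetical usage sketch: BayHess, update_curv_pairs and find_hess
# are assumed names inferred from the project description, not a
# verified API; check the package documentation for the real interface.
import numpy as np
from bayhess import BayHess  # assumed import path

rng = np.random.default_rng(1)
dim = 10
A = np.diag(np.linspace(0.5, 5.0, dim))  # toy true Hessian

# Assumed constructor arguments: problem dimension and bounds on the
# extreme eigenvalues (strong convexity and smoothness constants).
bay = BayHess(n_dim=dim, strong_conv=0.5, smooth=5.0)

# Feed noisy curvature pairs observed during optimization.
for _ in range(20):
    s = 0.1 * rng.standard_normal(dim)
    y = A @ s + 0.01 * rng.standard_normal(dim)  # noisy gradient difference
    bay.update_curv_pairs(s, y)

# Maximize the log posterior (Newton-CG with the homotopy approach
# described above) to obtain the Hessian approximation, then use it
# to precondition a stochastic gradient step.
H_hat = bay.find_hess()
g = A @ rng.standard_normal(dim)  # stand-in gradient estimate
step = np.linalg.solve(H_hat, g)
```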

Download files


Source Distribution

bayhess-0.1.4.tar.gz (24.4 kB)

Built Distribution

bayhess-0.1.4-py3-none-any.whl (24.2 kB)
