Nonlinear least-squares optimization for AxSI

Project description

Usage Guide

Overview

This section describes how to use the Nonlinear Least-Squares Optimization program.

Run from Command Line

To execute the program via the command line, use the following syntax:

python nonlinear_least_squares.py \
  --x0 1.0 2.0 \
  --bounds -10.0 10.0 -5.0 5.0 \
  --ftol 1e-6 \
  --xtol 1e-6 \
  --diff_step 1e-3 \
  --max_nfev 10000

Required Arguments:

  • --x0: Initial guess for the parameters (space-separated values).
  • --bounds: Parameter bounds as pairs (lower and upper limits for each parameter).

Optional Arguments:

  • --ftol (default: 1e-6): Tolerance for changes in the cost function value.
  • --xtol (default: 1e-6): Tolerance for updates to the parameter values.
  • --diff_step (default: 1e-3): Step size for finite-difference approximation.
  • --max_nfev (default: 10000): Maximum number of function evaluations allowed.
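These options correspond to standard trust-region solver settings. Assuming the script wraps SciPy's least_squares under the hood (an assumption for illustration; the toy residual below is hypothetical, not part of the package), the command-line call above maps roughly to:

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(x):
    # Hypothetical residual vector; the real program supplies its own.
    return np.array([x[0] - 3.0, x[1] + 1.0])

result = least_squares(
    residuals,
    x0=[1.0, 2.0],                        # --x0 1.0 2.0
    bounds=([-10.0, -5.0], [10.0, 5.0]),  # --bounds, regrouped as (lowers, uppers)
    ftol=1e-6,                            # --ftol
    xtol=1e-6,                            # --xtol
    diff_step=1e-3,                       # --diff_step
    max_nfev=10000,                       # --max_nfev
)
print(result.x)
```

Note that SciPy groups bounds as one tuple of lower limits and one of upper limits, whereas the command line takes them as per-parameter pairs.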

Function Import and Customization

The program also supports direct import of the nonlinear_least_squares function into Python scripts. This allows advanced customization, such as using custom residual (reg_func) and Jacobian (jac) functions.

Example:

from nonlinear_least_squares import nonlinear_least_squares

def custom_reg_func(x, *args):
    # Compute and return the vector of residuals at x.
    # Placeholder residuals of a linear model, for illustration only.
    return [x[0] - 1.0, x[1] - 2.0]

def custom_jacobian(x, *args):
    # Compute and return the Jacobian matrix of the residuals at x.
    # Identity here, matching the placeholder residuals above.
    return [[1.0, 0.0], [0.0, 1.0]]

result = nonlinear_least_squares(
    reg_func=custom_reg_func,
    x0=[1.0, 2.0],
    bounds=([-10.0, -5.0], [10.0, 5.0]),
    jac=custom_jacobian
)

print(result)

Output Description

The program provides a summary of the optimization process, including:

Key Results:

  • Success: Indicates whether the optimization converged successfully.
  • Optimized Parameters: The final parameter values at the solution.
  • Residuals: Values of the residual function at the solution.
  • Jacobian: Jacobian matrix at the solution.
  • Exit Flag: Integer code indicating the reason for termination.

Detailed Metadata:

The output includes a dictionary with the following fields:

  • algorithm: Solver used for optimization.
  • firstorderopt: Measure of first-order optimality.
  • iterations: Number of iterations performed.
  • funcCount: Number of function evaluations.
  • cgiterations: Number of conjugate gradient iterations.
  • Message: Descriptive message about the termination reason.
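These field names resemble MATLAB's lsqnonlin output structure. As a sketch of how such a summary could be assembled from SciPy's least_squares result (the field-to-attribute mapping is an assumption, not the package's documented behavior, and the residual is hypothetical):

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(x):
    # Hypothetical residual vector for illustration only.
    return np.array([x[0] - 3.0, x[1] + 1.0])

res = least_squares(residuals, x0=[0.0, 0.0])

summary = {
    "algorithm": "trust-region-reflective",  # SciPy's default method, 'trf'
    "firstorderopt": res.optimality,         # first-order optimality measure
    "funcCount": res.nfev,                   # number of function evaluations
    "Message": res.message,                  # termination description
    # iterations / cgiterations are not exposed directly by SciPy's result
}
print(res.status)   # exit flag: integer termination code
print(summary)
```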

Python version

This project currently targets Python 3.12.

Installation

It is recommended to use virtualenv to create a clean Python environment.

To install lsqAxSI, use pip:

pip install lsqAxSI

Execution

The main script shipped with this project is lsq_AxSI.py; view its options by running:

lsq_AxSI.py -h


Download files

Download the file for your platform.

Source Distribution

lsqaxsi-0.0.3.tar.gz (28.2 kB)

File details

Details for the file lsqaxsi-0.0.3.tar.gz.

File metadata

  • Size: 28.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.8

File hashes

Hashes for lsqaxsi-0.0.3.tar.gz:

  • SHA256: e8a2fb8e305d1c822d7f106a5de622a773e11f7b84fddd02eac7677cc40aeff6
  • MD5: e07474e7c617417159bcdf3db3a4df8a
  • BLAKE2b-256: 844738a09285c0c419ec7e7f35fb699d8034517dc862e4b478b4b62b85569818
