
NumAn_Op (short for Numerical Analysis & Optimization) is a Python package that provides numerical analysis and optimization algorithms.

Project description

Numerical_Analysis_Optimization_Package

Python package that contains some numerical analysis & optimization algorithms.

Instructions:

  1. Install: Run this command in your terminal:

    
    pip install NumAn-Op
    
    
  2. Modules: There are 3 modules in this package:

    To use them, you can import them as follows:

    
    from NumAn_Op import one_dim_min
    
    from NumAn_Op import sys_eq
    
    from NumAn_Op import multi_dim_min
    
    

    After importing the modules, you can use the built-in help() function to get information about each module and the functions it contains.
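
    For example, a quick way to see everything the one-dimensional minimization module exposes (this uses only the standard Python help() call, nothing package-specific):

    from NumAn_Op import one_dim_min
    help(one_dim_min)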

Following are the algorithms present in this package:

I. One-dimensional function minimization algorithms:

  • Searching with elimination methods
    • Unrestricted search
    • Exhaustive search
    • Dichotomous search
    • Interval halving method
    • Fibonacci method
    • Golden section method
  • Searching with interpolation methods
    • Newton-Raphson method
    • Quasi-Newton method
    • Secant method
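
As an illustration of the elimination methods listed above, here is a minimal, self-contained golden section search. This is a generic sketch of the algorithm, not the package's own implementation, and the function name is illustrative only:

    import math

    def golden_section_min(f, a, b, tol=1e-6):
        """Minimize a unimodal function f on [a, b] by golden section search."""
        inv_phi = (math.sqrt(5) - 1) / 2          # 1/phi, about 0.618
        c = b - inv_phi * (b - a)
        d = a + inv_phi * (b - a)
        while (b - a) > tol:
            if f(c) < f(d):
                b = d                             # the minimum lies in [a, d]
            else:
                a = c                             # the minimum lies in [c, b]
            c = b - inv_phi * (b - a)
            d = a + inv_phi * (b - a)
        return (a + b) / 2

    # Example: the minimum of (x - 2)^2 on [0, 5] is at x = 2
    print(golden_section_min(lambda x: (x - 2) ** 2, 0, 5))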

II. System of Equations & Decompositions:

  • Gauss-Jordan Elimination
  • LU Decomposition Method
  • Cholesky Decomposition Method
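
As a brief illustration of how a decomposition is used to solve a linear system, here is a Cholesky-based solve written with NumPy; this is a generic sketch, and the package's own function names and signatures may differ:

    import numpy as np

    def cholesky_solve(A, b):
        """Solve A x = b for a symmetric positive-definite A via A = L L^T."""
        L = np.linalg.cholesky(A)                 # lower-triangular factor
        y = np.linalg.solve(L, b)                 # forward substitution:  L y = b
        x = np.linalg.solve(L.T, y)               # back substitution:     L^T x = y
        return x

    A = np.array([[4.0, 2.0], [2.0, 3.0]])        # symmetric positive-definite
    b = np.array([6.0, 5.0])
    print(cholesky_solve(A, b))                   # matches np.linalg.solve(A, b): [1. 1.]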

III. Multi-dimensional function minimization algorithms:

  • Gradient methods
    • Gradient Descent method
    • Conjugate Gradient method
    • AdaGrad
  • Newton methods
    • Newton method
    • Quasi-Newton method with the DFP update and Armijo line search
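
To give a concrete feel for the gradient methods listed above, here is a minimal fixed-step gradient descent sketch. It is generic code, not the package's own implementation; argument names and stopping criteria are illustrative:

    import numpy as np

    def gradient_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=10000):
        """Minimize a differentiable function given its gradient."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:           # stop when the gradient is (almost) zero
                break
            x = x - lr * g
        return x

    # Example: f(x, y) = (x - 1)^2 + 2*(y + 3)^2 has its minimum at (1, -3)
    grad_f = lambda v: np.array([2 * (v[0] - 1), 4 * (v[1] + 3)])
    print(gradient_descent(grad_f, x0=[0.0, 0.0]))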

Visualization of the progress of some algorithms

Below are some plots visualizing the progress of some of the algorithms in the package. You can find the scripts used to generate them in the Plotting_Scripts folder of this repository.

  1. One-dimensional function minimization comparison:

    • Elimination methods comparison

      Function Optimization Comparison. (Elimination Methods)

    • Interpolation methods comparison

      Function Optimization Comparison. (Interpolation Methods)

  2. Multi-variable function minimization comparison:

    Multi-dimensional function minimization algorithms comparison

Note: You won't get the same progress path for the AdaGrad method if you rerun the scripts in the Plotting_Scripts folder. This is because AdaGrad is a variant of the stochastic gradient descent method, and the script takes a random starting point each time.
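
For reference, here is a minimal sketch of the AdaGrad update (generic code, not the package's implementation); it uses a random starting point, as the plotting scripts do, so each run traces a different path:

    import numpy as np

    def adagrad(grad, x0, lr=0.5, eps=1e-8, n_iter=2000):
        """AdaGrad: per-coordinate step sizes scaled by accumulated squared gradients."""
        x = np.asarray(x0, dtype=float)
        accum = np.zeros_like(x)                  # running sum of squared gradients
        for _ in range(n_iter):
            g = grad(x)
            accum += g ** 2
            x = x - lr * g / (np.sqrt(accum) + eps)
        return x

    # Random starting point, as in the plotting scripts
    rng = np.random.default_rng()
    x0 = rng.uniform(-5.0, 5.0, size=2)
    grad_f = lambda v: np.array([2 * (v[0] - 1), 4 * (v[1] + 3)])  # gradient of (x-1)^2 + 2(y+3)^2
    print(adagrad(grad_f, x0))                    # approaches the minimum near (1, -3)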

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

NumAn_Op-0.0.2.tar.gz (8.0 kB)

Uploaded Source

Built Distribution

NumAn_Op-0.0.2-py3-none-any.whl (8.9 kB)

Uploaded Python 3

File details

Details for the file NumAn_Op-0.0.2.tar.gz.

File metadata

  • Download URL: NumAn_Op-0.0.2.tar.gz
  • Upload date:
  • Size: 8.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.7.1 importlib_metadata/4.8.1 pkginfo/1.8.2 requests/2.27.1 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.10.0

File hashes

Hashes for NumAn_Op-0.0.2.tar.gz
  • SHA256: 042d789ee07061d83a693eec47fe9c0706d0801ac177ba70371ae535c5bec396
  • MD5: 6e048903e1d4818106b64cd71e510991
  • BLAKE2b-256: a0bca96c2244b6ee9a3253b22ba0475bf80b0e1b1b6e37522c3da09a131828ee

See more details on using hashes here.

File details

Details for the file NumAn_Op-0.0.2-py3-none-any.whl.

File metadata

  • Download URL: NumAn_Op-0.0.2-py3-none-any.whl
  • Upload date:
  • Size: 8.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.7.1 importlib_metadata/4.8.1 pkginfo/1.8.2 requests/2.27.1 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.10.0

File hashes

Hashes for NumAn_Op-0.0.2-py3-none-any.whl
  • SHA256: 5375b0b1c7c79477c87442ab1dde15ef2c90d7072626d2907ba8b019720bb960
  • MD5: d1518b451b395aff998cf972c8e7ba93
  • BLAKE2b-256: c8a1c633a2c7652993a8a99cccc002dc16bd23a5e85d18f3650f9df72d990b44

See more details on using hashes here.
