Implementation of ML Optimization Methods in Python

Project description

Optimization Methods

Methods Discussed

  • Golden Section Search Method
  • BiSection Method
  • Newton Method
  • Secant Method
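
All four routines minimize a function of a single real variable, either over a bracketing interval or from a starting point. The package is published on PyPI, so it can be installed with pip install mloptm before running the examples below.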

Golden Section Search Method

>>> from mloptm.methods import Golden
>>> def f(x):
...     return x**4 - 14*x**3 + 60*x**2 - 70*x

>>> op = Golden(f)
>>> minima = op.Minimize(a0=0, b0=2, eps=0.03)
>>> op.PrintOptimizationSteps()

Using Golden Optimization Method
Found Local Minima at x -> [0.777088]
Optimization Steps with [9] Steps
---------------------------------
	  a0        b0        a1        b1    Minima
--------  --------  --------  --------  --------
0         1.23607   0.763932  1.23607   0.618034
0.472136  1.23607   0.472136  0.763932  0.854102
0.472136  0.944272  0.763932  0.944272  0.708204
0.652476  0.944272  0.652476  0.763932  0.798374
0.652476  0.832816  0.763932  0.832816  0.742646
0.72136   0.832816  0.72136   0.763932  0.777088
0.763932  0.832816  0.763932  0.790243  0.798374
0.763932  0.806504  0.790243  0.806504  0.785218
0.763932  0.790243  0.780193  0.790243  0.777088

>>> op.PlotOptimizationSteps(xmin=0, xmax=2)
>>> op.ExportOptimizationSteps(xmin=0, xmax=2, filname="OptimizedFunction")
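
For intuition, the interval update that golden section search performs can be sketched in a few lines of plain Python. This is an illustrative re-implementation, not the code inside the Golden class, and the helper name golden_section is hypothetical:

import math

def golden_section(f, a, b, eps=0.03):
    # Shrink the bracket [a, b] by the golden ratio until it is shorter than eps.
    rho = (3 - math.sqrt(5)) / 2            # ~0.381966, i.e. 1 - 1/phi
    while (b - a) > eps:
        a1 = a + rho * (b - a)              # left interior evaluation point
        b1 = b - rho * (b - a)              # right interior evaluation point
        if f(a1) < f(b1):
            b = b1                          # minimum is bracketed by [a, b1]
        else:
            a = a1                          # minimum is bracketed by [a1, b]
    return (a + b) / 2

f = lambda x: x**4 - 14*x**3 + 60*x**2 - 70*x
print(golden_section(f, 0, 2))              # ~0.78, consistent with the table above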

BiSection Method

>>> from mloptm.methods import BiSection
>>> def f(x):
...     return x**4 - 14*x**3 + 60*x**2 - 70*x

>>> def df(x):                                                                      
...     return 4*x**3 - 14*3*x**2 + 120*x - 70

>>> op = BiSection(f, df)
>>> minima = op.Minimize(a0=0, b0=2, epochs=10)
>>> op.PrintOptimizationSteps()

Using BiSection Optimization Method
Found Local Minima at x -> [0.779297]
---------------------------------
	  a0       b0    (a0+b0)/2    f'((a0+b0)/2)
--------  -------  -----------  ---------------
0         1           1              12
0.5       1           0.5           -20
0.75      1           0.75           -1.9375
0.75      0.875       0.875           5.52344
0.75      0.8125      0.8125          1.91895
0.75      0.78125     0.78125         0.022583
0.765625  0.78125     0.765625       -0.949448
0.773438  0.78125     0.773438       -0.461435
0.777344  0.78125     0.777344       -0.218928
0.779297  0.78125     0.779297       -0.0980478

>>> op.PlotOptimizationSteps(xmin=0, xmax=2)
>>> op.ExportOptimizationSteps(xmin=0, xmax=2, filname="OptimizedFunction")
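
Unlike golden section search, bisection here works on the derivative: it keeps an interval on which f' changes sign and halves it once per epoch. A minimal sketch under that reading (the helper bisection_min is hypothetical, not part of mloptm):

def bisection_min(df, a, b, epochs=10):
    # Halve [a, b] a fixed number of times, keeping the sign change of df inside.
    for _ in range(epochs):
        mid = (a + b) / 2
        if df(mid) > 0:
            b = mid                          # positive slope: the minimum lies to the left
        else:
            a = mid                          # negative (or zero) slope: it lies to the right
    return (a + b) / 2

df = lambda x: 4*x**3 - 42*x**2 + 120*x - 70
print(bisection_min(df, 0, 2, epochs=10))    # ~0.78, consistent with the table above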

Newton Method

>>> from mloptm.methods import Newton
>>> def f(x):
...     return x**4 - 14*x**3 + 60*x**2 - 70*x

>>> def df(x):
...     return 4*x**3 - 14*3*x**2 + 120*x - 70

>>> def ddf(x):
...     return 12*x**2 - 14*6*x + 120

>>> op = Newton(f, df, ddf)
>>> minima = op.Minimize(x0=0, eps=10**-5)
>>> op.PrintOptimizationSteps()

Using Newton Optimization Method
Found Local Minima at x -> [0.780884]
---------------------------------
	  xk      xk+1       f'(xk+1)    f''(xk+1)
--------  --------  -------------  -----------
0         0.583333  -13.4977           75.0833
0.583333  0.763103   -1.10786          62.8873
0.763103  0.780719   -0.0101707        61.7339
0.780719  0.780884   -8.85683e-07      61.7231
0.780884  0.780884    0                61.7231

>>> op.PlotOptimizationSteps(xmin=0, xmax=2)
>>> op.ExportOptimizationSteps(xmin=0, xmax=2, filname="OptimizedFunction")
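
Newton's method uses both derivatives and updates the iterate with x_{k+1} = x_k - f'(x_k) / f''(x_k), which is why it converges in only a handful of steps in the table above. A minimal sketch (the helper newton_min is hypothetical, not part of mloptm):

def newton_min(df, ddf, x0, eps=1e-5):
    # Repeat x <- x - df(x)/ddf(x) until the step size drops below eps.
    x = x0
    while True:
        step = df(x) / ddf(x)
        x -= step
        if abs(step) < eps:
            return x

df  = lambda x: 4*x**3 - 42*x**2 + 120*x - 70
ddf = lambda x: 12*x**2 - 84*x + 120
print(newton_min(df, ddf, 0))                # ~0.780884, matching the table above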

Secant Method

>>> from mloptm.methods import Secant
>>> def f(x):
...     return x**4 - 14*x**3 + 60*x**2 - 70*x

>>> def df(x):                                                                      
...     return 4*x**3 - 14*3*x**2 + 120*x - 70

>>> op = Secant(f, df)
>>> minima = op.Minimize(a0=0, b0=2, epochs=10)
>>> op.PrintOptimizationSteps()

Using Secant Optimization Method
Found Local Minima at x -> [0.780884]
---------------------------------
	  xk      xk+1    f(xk+1)       f'(xk+1)
--------  --------  ---------  -------------
0         0.604282   -23.3462  -11.9401
0.1       0.733837   -24.3002   -2.97653
0.604282  0.776858   -24.3691   -0.249017
0.733837  0.780786   -24.3696   -0.00605475
0.776858  0.780884   -24.3696   -1.28637e-05
0.780786  0.780884   -24.3696   -6.67001e-10

>>> op.PlotOptimizationSteps(xmin=0, xmax=2)
>>> op.ExportOptimizationSteps(xmin=0, xmax=2, filname="OptimizedFunction")
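
The secant method is Newton's method with f'' replaced by a finite-difference approximation built from the two most recent iterates, x_{k+1} = x_k - f'(x_k) * (x_k - x_{k-1}) / (f'(x_k) - f'(x_{k-1})), so only the first derivative is needed. A minimal sketch (secant_min is a hypothetical helper; the starting pair 0 and 0.1 is chosen only because it reproduces the numbers in the table above, and is not necessarily how Minimize seeds its iterates from a0 and b0):

def secant_min(df, x_prev, x, epochs=10):
    # Apply the secant update to the derivative for a fixed number of epochs.
    for _ in range(epochs):
        denom = df(x) - df(x_prev)
        if denom == 0:                       # derivative values coincide; already converged
            break
        x_prev, x = x, x - df(x) * (x - x_prev) / denom
    return x

df = lambda x: 4*x**3 - 42*x**2 + 120*x - 70
print(secant_min(df, 0, 0.1, epochs=10))     # ~0.780884, matching the table above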

Download files

Download the file for your platform.

Source Distribution

mloptm-1.0.4.tar.gz (6.8 kB)

Uploaded Source

Built Distribution

mloptm-1.0.4-py3-none-any.whl (7.0 kB)

Uploaded Python 3

File details

Details for the file mloptm-1.0.4.tar.gz.

File metadata

  • Download URL: mloptm-1.0.4.tar.gz
  • Upload date:
  • Size: 6.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.12

File hashes

Hashes for mloptm-1.0.4.tar.gz
Algorithm Hash digest
SHA256 c10fd07670729fb230e8c0aa98d1b75149da754053d623dc4bd79a1a549918bf
MD5 1dc99b9e8d404862778f457b028e42fa
BLAKE2b-256 bae4bf0419d57a78d75561ddf3d98200067b62cfd334046f3856a3e036e89575

File details

Details for the file mloptm-1.0.4-py3-none-any.whl.

File metadata

  • Download URL: mloptm-1.0.4-py3-none-any.whl
  • Upload date:
  • Size: 7.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.12

File hashes

Hashes for mloptm-1.0.4-py3-none-any.whl
Algorithm Hash digest
SHA256 923919d96c79c397787f0fd71b1adb27987a8aa1f8b721890d0aba68717e883b
MD5 421538d1fd21fb9ad2f9d6474d31df9e
BLAKE2b-256 2a97cc1971663af157b4c17a1db4dc2d8c8d993b7d8c60ed7cefbd518d2bfd9e
