
A collection of optimization algorithms

Project description

lab_optimizer, latest version 1.2.4

An optimization algorithms package.

  • Provides:

    1. global_optimizer, including algorithms for finding global minima
    2. local_optimizer, including algorithms for finding local minima
    3. mloop_optimizer, a general, integrated API to M-LOOP's algorithms
    4. torch_optimizer (for torch functions only), a general, integrated API to torch's optimizers
    5. powerful visualization tools
    6. physics constants and unit conversion
    7. lab_optimizer examples
    8. a flexible API for custom optimization algorithms
  • To install this package, use

pip install lab_optimizer

Quick Start

  • Basic usage:
## general form
opt = XXX_optimize(func,paras_init,bounds,args)
x_opt = opt.optimization() # x_opt is the optimization result
opt.visualization()
"""
Args
--------
fun : callable
    The objective function to be minimized.

        ``fun(x, *args) -> dict : {'cost':float, 'uncer':float, 'bad':bool}``
        
    where ``cost`` is the value to minimize, ``uncer`` is its uncertainty,
    and ``bad`` flags whether this measurement is bad (bad = True).
    
    If you set val_only = True, you can set ``bad`` and ``uncer`` to anything,
    because they will not be used; val_only defaults to True.

    ``x`` is a 1-D array with shape (n,) and ``args``
    is a tuple of the fixed parameters needed to completely
    specify the function.

paras_init : ndarray, shape (n,)
    Initial guess. Array of real elements of size (n,),
    where ``n`` is the number of independent variables.

args : tuple, optional
    Extra arguments passed to the objective function; these do not
    change during optimization.
    
bounds : sequence or `Bounds`, optional
    Bounds on variables for Nelder-Mead, L-BFGS-B, TNC, SLSQP, Powell,
    trust-constr, COBYLA, and COBYQA methods. To specify
    the bounds:

        Sequence of ``(min, max)`` pairs for each element in `x`. None is used to specify no bound.

kwArgs
---------
ave_dict : dict
    - ave : Bool
        whether to use averaging
    - ave_times : int
        number of averaging runs
    - ave_wait
        wait time between averaging runs
    - ave_opt
        averaging operation code, default is "ave"
        - "ave" : average following cost_dict
        - "std" : for val_only functions; uncertainty is calculated automatically
        
    default is {False, X, X, X}
    if you set ave == True, the default is {True, 3, 0.01, "ave"}

extra_dict : dict
    extra parameters passed to the scipy.optimize.minimize family, such as jac, hess, ...

method : str 
    optimization algorithm to use

delay : float 
    delay between iterations, default is 0.1 s

max_run : int 
    maximum number of optimization runs

msg : Bool
    whether to output messages on every iteration, default is True
    
log : Bool
    whether to generate a log file in folder labopt_logs
    
logfile : str
    log file name, default is "optimization__ + <timestamp>__ + <method>__.txt";
    has lower priority than an inherited logfile
    
opt_inherit : opt_class 
    inherit ``optimization results``, ``parameters`` and ``logs``
    default is None (do not inherit)
"""
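To make the cost-dict contract above concrete, here is a toy objective that satisfies it. The quadratic itself is illustrative, not part of lab_optimizer:

```python
import numpy as np

# Toy objective following the cost-dict contract described above:
# 'cost' is the value to minimize, 'uncer' its uncertainty, and 'bad'
# flags a measurement that should be discarded. The quadratic itself
# is illustrative, not part of lab_optimizer.
def func(x, a=1.0, b=2.0):
    cost = a * x[0] ** 2 + b * x[1] ** 2
    return {"cost": float(cost), "uncer": 0.01, "bad": False}

result = func(np.array([1.0, -1.0]))
print(result)  # {'cost': 3.0, 'uncer': 0.01, 'bad': False}
```

A function of this shape can then be handed to any of the `XXX_optimize` classes together with `paras_init`, `bounds` and `args` as shown in the general form above.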

## get example code
from lab_optimizer import examples
examples(opcode = "direct_opt") # direct_opt, inherit_opt, log
  • Usage examples:

    • without opt_inherit
    from lab_optimizer import global_optimize
    opt = global_optimize(func,paras_init,bounds,args)
    x_opt = opt.optimization()
    opt.visualization()
    
    • with opt_inherit (cascading multiple optimizers)
    from lab_optimizer import global_optimize
    opt1 = global_optimize(func,paras_init,bounds,args,log = "inherit")
    x_opt1 = opt1.optimization()
    # x_opt1 = opt.x_optimize ## you can also use this one
    opt2 = global_optimize(func,x_opt1,bounds,args,opt_inherit = opt1) # paras_init will be automatically set to x_opt1 
    opt2.optimization()
    opt2.visualization()
    
    • Generic functional interface

    to use another optimization algorithm, you only need to change the opt_class and adjust its arguments slightly

    from lab_optimizer import global_optimize, local_optimize
    opt1 = global_optimize(func,paras_init,bounds,args,log = "inherit")
    x_opt1 = opt1.optimization()
    opt2 = local_optimize(func,x_opt1,bounds,args,opt_inherit = opt1) # just change opt_class from global_opt to local_opt
    opt2.optimization()
    opt2.visualization()
    
  • units module

from lab_optimizer import units
"""
then you can easily use physics constants like :

- planck constant : units.h_const
- speed of light in vacuum : units.c0_const

and do units conversion :  

- freq = 100*units.THz = 1e5*units.GHz = 1e8*units.MHz = ... # freq = 1e14 Hz
- m = 100*units.kg = 1e5*units.g # m = 100[kg]

the units module uses SI base units (kg, m, s)
"""
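The scale-factor convention described above can be sketched in a few lines. The constants below are illustrative stand-ins defined locally, mimicking `units.THz`, `units.kg`, etc., not the library's own attributes:

```python
# Illustrative stand-ins for the units module's scale-factor convention:
# each unit is a multiplier converting a value to SI base units (kg, m, s).
# These names mimic units.THz, units.kg, etc., but are defined locally here.
Hz, MHz, GHz, THz = 1.0, 1e6, 1e9, 1e12
kg, g = 1.0, 1e-3

freq = 100 * THz        # stored as 1e14 Hz
m = 1e5 * g             # same as 100 * kg, i.e. 100 kg
print(freq, m)          # 1e+14 100.0
```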

Documentation

  • local_optimize :

    local_optimize aims at finding a local minimum of a function. The local_optimize submodule is built on scipy.optimize.minimize and supports all of its algorithms:

    • Nelder-Mead (default)
    • L-BFGS-B
    • Powell
    • ...
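Since local_optimize is built on scipy.optimize.minimize, the underlying backend call looks like this (using scipy directly, with a plain scalar return rather than lab_optimizer's cost-dict wrapper):

```python
import numpy as np
from scipy.optimize import minimize

# Direct call to the scipy backend that local_optimize wraps. Note that
# scipy expects a plain scalar return, not lab_optimizer's cost-dict.
def rosen(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

res = minimize(rosen, x0=np.array([0.0, 0.0]), method="Nelder-Mead",
               bounds=[(-2, 2), (-2, 2)])
print(res.x)  # close to [1, 1], the Rosenbrock minimum
```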
  • global_optimize :

    global_optimize aims at finding the global minimum of a function. The global_optimize submodule is built on scipy.optimize and scikit-opt, and includes several powerful algorithms.
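For the global case, one of the scipy backends, differential_evolution, can be called directly on a multimodal test function. This sketches the backend rather than global_optimize's own wrapper:

```python
import numpy as np
from scipy.optimize import differential_evolution

# The Ackley function: highly multimodal, global minimum 0 at the origin.
# Local optimizers easily get stuck in one of its many local minima.
def ackley(x):
    arg1 = -0.2 * np.sqrt(0.5 * (x[0] ** 2 + x[1] ** 2))
    arg2 = 0.5 * (np.cos(2 * np.pi * x[0]) + np.cos(2 * np.pi * x[1]))
    return -20.0 * np.exp(arg1) - np.exp(arg2) + 20.0 + np.e

res = differential_evolution(ackley, bounds=[(-5, 5), (-5, 5)], seed=1)
print(res.x)  # close to [0, 0]
```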

  • mloop_optimize :

    mloop_optimize inherits all functions of M-LOOP:

    • gaussian_process (default)
    • neural_net
    • differential_evolution
    • Nelder-Mead
    • Random
  • torch_optimize :

    torch_optimize aims at optimizing explicit functions (functions that can be expressed directly in code, rather than measured from an experiment), using the torch family of gradient-based optimization algorithms:

    • 'ASGD' (default)
    • 'SGD'
    • 'RMSprop'
    • 'LBFGS' (require extra parameters in extra_dict)
    • 'Rprop'
    • 'Adadelta'
    • 'Adagrad' (CPU only)
    • 'Adam'
    • 'NAdam'
    • 'RAdam'
    • 'AdamW'
    • 'Adamax'
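The gradient-descent idea shared by this optimizer family can be sketched without torch, using a hand-coded gradient on a toy quadratic (all names here are illustrative):

```python
# Hand-rolled gradient descent on f(x, y) = x^2 + y^2: the core update
# rule p <- p - lr * grad(p) shared by the SGD-family optimizers above.
def grad(p):
    return [2 * p[0], 2 * p[1]]

p = [3.0, -4.0]
lr = 0.1                      # learning rate
for _ in range(100):
    g = grad(p)
    p = [p[0] - lr * g[0], p[1] - lr * g[1]]

print(p)  # both coordinates shrink toward the minimum at 0
```

The torch optimizers listed above refine this same update with momentum, adaptive step sizes, and similar variations.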

ReleaseNotes

  • 1.1.x

    added advanced visualization tools

  • 1.2.x

    added handling of optimization exceptions

    added optimizer extensions, providing a general interface for custom optimization algorithms

    added an optimizer test-function library

    added an optimization algorithms library

  • 1.3.x (in development)

    adding multiprocessing ...

