A collection of optimization algorithms
Project description
lab_optimizer, latest version 1.2.3
an optimization algorithms package
-
Provides :
- global_optimizer, including algorithms for finding a global minimum
- local_optimizer, including algorithms for finding a local minimum
- mloop_optimizer, a general, unified API over M-LOOP's algorithms
- torch_optimizer (for torch functions only), a general, unified API over torch's gradient-based optimizers
- powerful visualization tools
- physics constants and units conversion
- lab_optimizer examples
- a flexible API for custom optimization algorithms
-
to install this package, use
pip install lab_optimizer
Quick Start
- basic usage :
## general form
opt = XXX_optimize(func,paras_init,bounds,args)
x_opt = opt.optimization() # x_opt is the optimization result
opt.visualization()
"""
Args
--------
func : callable
The objective function to be minimized.
``func(x, *args) -> dict : {'cost':float, 'uncer':float, 'bad':bool}``
where ``cost`` is the value to minimize, ``uncer`` is its uncertainty,
and ``bad`` indicates whether this evaluation of the cost is bad (bad = True).
If you set val_only = True (the default), you can set ``bad`` and ``uncer`` to anything, because they will not be used.
(A minimal example cost function is sketched just after this docstring.)
``x`` is a 1-D array with shape (n,) and ``args``
is a tuple of the fixed parameters needed to completely
specify the function.
paras_init : ndarray, shape (n,)
Initial guess. Array of real elements of size (n,),
where ``n`` is the number of independent variables.
args : tuple, optional
Extra arguments passed to the objective function; these will not
change during optimization.
bounds : sequence or `Bounds`, optional
Bounds on variables for Nelder-Mead, L-BFGS-B, TNC, SLSQP, Powell,
trust-constr, COBYLA, and COBYQA methods. To specify
the bounds:
Sequence of ``(min, max)`` pairs for each element in `x`. None is used to specify no bound.
kwArgs
---------
ave_dict : dict
- ave : Bool
whether to average the cost over repeated evaluations
- ave_times : int
number of averaging runs
- ave_wait : float
wait time during each averaging run
- ave_opt : str
average operation code, default is "ave"
- "ave" : average following the returned cost_dict
- "std" : for val_only functions; ``uncer`` is calculated automatically from the standard deviation
default is {False, X, X, X}
if you set ave == True, then the default is {True, 3, 0.01, "ave"}
extra_dict : dict
used for extra parameters passed to the scipy.optimize.minimize family, such as jac, hess, ...
method : str
optimization algorithm to use
delay : float
delay of each iteration, default is 0.1s
max_run : int
maximum number of optimization runs
msg : Bool
whether to print messages at every iteration, default is True
log : Bool
whether to generate a log file in the folder labopt_logs
logfile : str
log file name, default is "optimization__ + <timestamp>__ + <method>__.txt";
has lower priority than a logfile inherited via opt_inherit
opt_inherit : opt_class
inherit ``optimization results``, ``parameters`` and ``logs``
default is None (no inheritance)
"""
## get example code
from lab_optimizer import examples
examples(opcode = "direct_opt") # direct_opt, inherit_opt, log
-
usage examples :
- do not use opt_inherit
from lab_optimizer import global_optimize
opt = global_optimize(func,paras_init,bounds,args)
x_opt = opt.optimization()
opt.visualization()
- use opt_inherit (cascade multiple optimizers)
from lab_optimizer import global_optimize
opt1 = global_optimize(func,paras_init,bounds,args,log = "inherit")
x_opt1 = opt1.optimization() # x_opt1 = opt1.x_optimize ## you can also use this one
opt2 = global_optimize(func,x_opt1,bounds,args,opt_inherit = opt1) # paras_init will be automatically set to x_opt1
opt2.optimization()
opt2.visualization()
- Generic functional interface
to use another optimization algorithm, you only need to change the opt_class and adjust a few of its args
from lab_optimizer import global_optimize, local_optimize
opt1 = global_optimize(func,paras_init,bounds,args,log = "inherit")
x_opt1 = opt1.optimization()
opt2 = local_optimize(func,x_opt1,bounds,args,opt_inherit = opt1) # just change opt_class from global_opt to local_opt
opt2.optimization()
opt2.visualization()
-
units module
from lab_optimizer import units
"""
then you can easily use physics constants like :
- planck constant : units.h_const
- speed of light in vacuum : units.c0_const
and do units conversion :
- freq = 100*units.THz = 1e5*units.GHz = 1e8*units.MHz = ... # freq = 1e14 Hz
- m = 100*units.kg = 1e5*units.g # m = 100[kg]
the units module uses SI units (kg,m,s)
"""
Documentation
-
local_optimize :
local_optimize aims at finding a local minimum of a function. The local_optimize submodule is built on scipy.optimize.minimize and includes all of its supported algorithms (a usage sketch follows the list) :
- Nelder-Mead (default)
- L-BFGS-B
- Powell
- ...
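For example, a minimal local_optimize run on a simple quadratic bowl (the objective below is illustrative and follows the cost-dict contract from Quick Start):
import numpy as np
from lab_optimizer import local_optimize

def func(x, *args):
    # smooth bowl with a single minimum at (1, -2)
    cost = (x[0] - 1.0)**2 + (x[1] + 2.0)**2
    return {'cost': float(cost), 'uncer': 0.0, 'bad': False}

paras_init = np.array([0.0, 0.0])
bounds = ((-5, 5), (-5, 5))

opt = local_optimize(func, paras_init, bounds, method = "Nelder-Mead")
x_opt = opt.optimization()
opt.visualization()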
-
global_optimize :
global_optimize aims at finding the global minimum of a function. The global_optimize submodule is built on scipy.optimize and scikit-opt, and includes some powerful algorithms (a usage sketch follows the lists) :
-
based on scipy.optimize
- dual_annealing (default)
- differential_evolution
- direct
- shgo
based on scikit-opt
- genetic
- particle_swarm
- artificial_fish
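A minimal global_optimize sketch on a multimodal test function (the Rastrigin objective below is illustrative, not part of the package):
import numpy as np
from lab_optimizer import global_optimize

def func(x, *args):
    # multimodal Rastrigin objective; global minimum at the origin
    cost = 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
    return {'cost': float(cost), 'uncer': 0.0, 'bad': False}

paras_init = np.array([3.0, -2.0])
bounds = ((-5.12, 5.12), (-5.12, 5.12))

opt = global_optimize(func, paras_init, bounds, method = "dual_annealing")
x_opt = opt.optimization()
opt.visualization()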
-
-
mloop_optimize :
mloop_optimize inherits all functions of M-LOOP (a usage sketch follows the list) :
- gaussian_process (default)
- neural_net
- differential_evolution
- Nelder-Mead
- Random
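Assuming mloop_optimize follows the same generic interface as the other opt classes (the noisy objective below is illustrative):
import numpy as np
from lab_optimizer import mloop_optimize

def func(x, *args):
    # experiment-like objective: noisy readout with a known uncertainty
    cost = float(np.sum(x**2)) + float(np.random.normal(0.0, 0.01))
    return {'cost': cost, 'uncer': 0.01, 'bad': False}

paras_init = np.array([1.0, -1.0])
bounds = ((-2, 2), (-2, 2))

opt = mloop_optimize(func, paras_init, bounds, method = "gaussian_process")
x_opt = opt.optimization()
opt.visualization()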
-
torch_optimize :
torch_optimize aims at optimizing explicit functions (functions that can be expressed explicitly in your code, rather than measured from experiment results), using torch's family of gradient-based optimization algorithms (a usage sketch follows the list) :
- 'ASGD' (default)
- 'SGD'
- 'RMSprop'
- 'LBFGS' (require extra parameters in extra_dict)
- 'Rprop'
- 'Adadelta'
- 'Adagrad' (cpu_only)
- 'Adam'
- 'NAdam'
- 'RAdam'
- 'AdamW'
- 'Adamax'
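Because torch_optimize differentiates through the objective, the cost should be built from torch operations on the parameter tensor. A minimal sketch, assuming the same generic interface as above and that the cost may be returned as a torch tensor (an assumption; the exact contract for torch costs is not documented here):
import torch
from lab_optimizer import torch_optimize

def func(x, *args):
    # explicit objective built from torch ops, so gradients can flow
    cost = torch.sum((x - 1.0)**2)
    return {'cost': cost, 'uncer': 0.0, 'bad': False}

paras_init = torch.tensor([0.0, 0.0])
bounds = ((-5, 5), (-5, 5))

opt = torch_optimize(func, paras_init, bounds, method = "ASGD")
x_opt = opt.optimization()
opt.visualization()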
Release Notes
-
1.1.x
added advanced visualization tools
-
1.2.x
added functions to handle optimization exceptions
added optimizer extensions, providing a general interface for custom-defined opt algorithms
added an optimizer test-function library
added an opt algorithms library :
- Improved Slime Mould Algorithm (ISMA) @ global, an improved variant of the Slime Mould Algorithm
-
1.3.x (in development)
add multiprocessing ...
Download files
File details
Details for the file lab_optimizer-1.2.3.tar.gz.
File metadata
- Download URL: lab_optimizer-1.2.3.tar.gz
- Upload date:
- Size: 28.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.8
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 776eae6fa813ee1d77ba9609afa925b87ab324caa9787c159503bef1a1a8158c |
| MD5 | 82a474df7c60de1f1fa497e071e8d5d9 |
| BLAKE2b-256 | e07f3892723ab22d8486fc90259ba6cd5b7855026e12e820f9ba3bd60118611b |
File details
Details for the file lab_optimizer-1.2.3-py3-none-any.whl.
File metadata
- Download URL: lab_optimizer-1.2.3-py3-none-any.whl
- Upload date:
- Size: 32.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.8
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 9cc71c78bc9e2591f665cdc56b6572ffdab63e56e13117647222d24d0fdda786 |
| MD5 | f12c6177710f16633844a14c1e29ac6b |
| BLAKE2b-256 | b1b5e2bd799b9c3de0b271800e1c2eadc1afa3677a2b339f4322f4247e0a888a |