
A framework for optimization.


Optimization Framework

This is a framework for linear and non-linear optimization developed by ENACOM Group.

It solves a problem in the form:

min  f(x)

s.t. g(x) <= 0
     h(x)  = 0
     x_min <= x <= x_max

where x is an n-dimensional vector, f(x) maps the n-dimensional space to the o-dimensional space (o being the number of objectives), g(x) collects the inequality constraints and h(x) the equality constraints.
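For example, the single-objective problem min x1^2 + x2^2 subject to x1 + x2 - 1 <= 0 with -5 <= x <= 5 fits this form with n = 2 and o = 1 (g(x) has one component and h(x) is absent).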

Prerequisites

What you need to be able to run:

  • Python 3.6+

All requirements are in the file requirements.txt. To install all packages at once:

$ pip install -r requirements.txt

Installing science-optimization

Building science-optimization requires the following software to be installed:

  1. Python 3
    • Make sure that distutils is installed before continuing. For example, in Debian GNU/Linux, distutils is included in the python3-dev package.
  2. PyBind11
    • Note that this is PyBind11, not PyBind.
  3. GNU C Compiler version >= 5.0

To install the package in any environment:

  • clone this repo
  • $ pip install ./science-optimization

Conventions

Development

  • In Python programming, it is common practice to prefix "private" methods/attributes with an underscore (e.g. _private_method(), _private_attribute). This framework follows that convention

  • Attribute getters and setters are implemented using the native @property decorator

  • Abstract base classes use abc.ABCMeta as their metaclass

  • Google style is the default docstring format

  • A package must contain an __init__.py file with the package initialization

  • PyCharm 2017+ is highly recommended

A detailed explanation of code development can be found here and here.

Tutorials on Python programming can be found here.
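
As an illustration of these conventions only (this class is hypothetical and not part of the framework), an abstract base class could be written as:

    import abc


    class BaseExample(metaclass=abc.ABCMeta):
        """Hypothetical abstract base class following the conventions above.

        Attributes:
            eps: numerical precision.
        """

        def __init__(self, eps=1e-8):
            self._eps = eps  # "private" attribute prefixed with an underscore

        @property
        def eps(self):
            """float: numerical precision for calculations."""
            return self._eps

        @eps.setter
        def eps(self, value):
            self._eps = value

        @abc.abstractmethod
        def eval(self, x):
            """Evaluates the function at point x (implemented by subclasses)."""
            pass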

Optimization

Given that n is the problem dimension and m is the number of points (see the NumPy sketch after this list):

  • A single point is an n x 1 matrix

  • Multiple points are an n x m matrix

  • The gradient evaluated at m points is an n x m matrix

  • The Hessian evaluated at m points is an n x n x m matrix
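
As a quick illustration with NumPy (the arrays below are placeholders; only their shapes matter):

    import numpy as np

    n, m = 3, 5                    # problem dimension and number of points

    x_single = np.zeros((n, 1))    # a single point: n x 1
    x_multi = np.zeros((n, m))     # multiple points: n x m
    grad = np.zeros((n, m))        # gradient evaluated at the m points: n x m
    hess = np.zeros((n, n, m))     # Hessian evaluated at the m points: n x n x m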

Packages

Builder

This package is responsible for building an optimization problem. It is based on the Builder design pattern and has the following classes (a short usage sketch follows the list).

  1. OptimizationProblem(): the base class of every optimization problem.

    • Attributes:

      • objective: Objective() instance

      • constraints: Constraints() instance

      • variables: Variables() instance

    • Methods:

      • _build_problem(builder): receives a BuilderOptimizationProblem() instance, checks problem consistency and sets attributes

      • _check_consistency(variables, objectives, constraints): receives attributes and checks problem consistency

      • info(): prints the assembled problem to the user

      • __call__(): overloaded method that turns an OptimizationProblem instance into a callable that evaluates f, g, h, df, dg, dh, hf, hg and hh at a given point x

  2. BuilderOptimizationProblem(): abstract class responsible for defining the builder's methods of the optimization problem.

    • Abstract methods:

      • build_objectives(): abstract method responsible for building the problem's objectives

      • build_constraints(): abstract method responsible for building the problem's constraints

      • build_variables(): abstract method responsible for building the problem's variables

  3. Objective(): container class of objective functions.

    • Attributes:

      • objective_functions: a list with all objective functions of the problem. Each function is a FunctionsComposite() instance.
    • Methods:

      • C(): returns the matrix C of linear objective coefficients

      • d(): returns the vector d of linear objective constants

  4. Constraints(): container class of constraint functions.

    • Attributes:

      • equality_constraints: a list with equality constraint functions. Each function is a FunctionsComposite() instance.

      • inequality_constraints: a list with inequality constraint functions. Each function is a FunctionsComposite() instance.

    • Methods:

      • equality_constraints_feasibility(x): evaluates the feasibility of x on these constraints

      • inequality_constraints_feasibility(x): evaluates the feasibility of x on these constraints

      • A(): returns the matrix A of linear inequality constraint coefficients

      • b(): returns the vector b of linear inequality constraint bounds

      • Aeq(): returns the matrix Aeq of linear equality constraint coefficients

      • beq(): returns the vector beq of linear equality constraint bounds

  5. Variables(): container class of the problem variables.

    • Attributes:

      • x_min: variables' lower bounds

      • x_max: variables' upper bounds

      • x_type: list with variables' type ('c': continuous or 'd': discrete)
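
As a rough sketch of how these classes fit together (the import path and the builder keyword argument are assumptions; see the examples package for the actual usage):

    from science_optimization.builder import OptimizationProblem  # assumed path

    # some_builder: an instance of a concrete BuilderOptimizationProblem subclass
    # (see the Problems section for the expected format)
    problem = OptimizationProblem(builder=some_builder)

    problem.info()                  # prints the assembled problem

    print(problem.variables.x_min)  # variables' lower bounds
    print(problem.constraints.A())  # linear inequality constraint coefficients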

Function

This package contains the definitions of a function. It is based on the Composite design pattern and contains the following classes (a short usage sketch follows the list):

  1. BaseFunction(): base class of every other class inside this package.

    • Attributes:

      • parameters: abstract attribute for functions parameters (e.g. coefficients)

      • eps: precision for numerical calculations

    • Methods:

      • eval(x): abstract method for evaluating the function at x

      • gradient(x): numerical calculation of the gradient at x

      • hessian(x): numerical calculation of the Hessian at x

      • dimension(): function dimension n

  2. FunctionsComposite(): container class of BaseFunction()'s children.

    • Attributes:

      • functions: a list of functions
    • Methods:

      • eval(x): evaluates the contained functions at x

      • gradient(x): numerical calculation of the gradients of the contained functions at x

      • hessian(x): numerical calculation of the Hessians of the contained functions at x

      • add(f): adds function f to functions

      • remove(idx): removes element functions[idx]

      • clear(): removes all functions from list

  3. LinearFunction(): implements a function in the form f(x) = c'x + d

    • Attributes:

      • parameters: dictionary with c and d keys
    • Methods:

      • eval(x): evaluates the function at x

      • gradient(x): analytical calculation of the gradient at x

      • hessian(x): analytical calculation of the Hessian at x

  4. QuadraticFunction(): implements a function in the form f(x) = x'Qx + c'x + d

    • Attributes:

      • parameters: dictionary with Q, c and d keys
    • Methods:

      • eval(x): evaluates the function at x

      • gradient(x): analytical calculation of the gradient at x

      • hessian(x): analytical calculation of the Hessian at x

  5. PolynomialFunction(): implements a polynomial function

    • Attributes:

      • parameters: dictionary with exponents e and coefficients c
    • Methods:

      • eval(x): evaluates the function at x

      • gradient(x): analytical calculation of the gradient at x

      • hessian(x): analytical calculation of the Hessian at x
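
As a usage sketch for QuadraticFunction (assuming the parameter keys listed above are accepted as keyword arguments by the constructor and that the import path is science_optimization.function; both should be verified against the package):

    import numpy as np
    from science_optimization.function import QuadraticFunction  # assumed path

    # f(x) = x'Qx + c'x + d with n = 2
    Q = np.array([[2.0, 0.0],
                  [0.0, 1.0]])
    c = np.array([[-1.0],
                  [0.0]])
    d = 0.0

    f = QuadraticFunction(Q=Q, c=c, d=d)  # assumed keyword arguments

    x = np.array([[1.0], [2.0]])  # a single point: n x 1
    print(f.eval(x))              # function value at x
    print(f.gradient(x))          # analytical gradient: n x 1
    print(f.hessian(x))           # analytical Hessian: n x n x 1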

Problems

This package contains the concrete BuilderOptimizationProblem() implementations, one for each optimization problem. The classes must follow the format:

  1. Problem(BuilderOptimizationProblem): inherits from BuilderOptimizationProblem

    • Attributes: necessary problem attributes

    • Methods:

      • build_objectives(): concrete method responsible for building the problem's objectives

      • build_constraints(): concrete method responsible for building the problem's constraints

      • build_variables(): concrete method responsible for building the problem's variables

The class Generic(BuilderOptimizationProblem) builds a generic optimization problem in the canonical format defined at the beginning of this document. However, it is also possible to implement customized optimization problems by inheriting from the BuilderOptimizationProblem class, as the Diet and KleeMinty problems do (a skeleton is sketched below).
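
A skeleton of such a customized problem might look like the following. The container constructors and keyword arguments shown here are assumptions and should be checked against the existing Diet and KleeMinty implementations:

    from science_optimization.builder import (BuilderOptimizationProblem,  # assumed paths
                                              Objective, Constraints, Variables)
    from science_optimization.function import FunctionsComposite, LinearFunction


    class MyProblem(BuilderOptimizationProblem):
        """Hypothetical problem: minimize c'x subject to x_min <= x <= x_max."""

        def __init__(self, c, x_min, x_max):
            self._c = c
            self._x_min = x_min
            self._x_max = x_max

        def build_objectives(self):
            # single linear objective f(x) = c'x
            obj = FunctionsComposite()
            obj.add(LinearFunction(c=self._c, d=0))  # assumed keyword arguments
            return Objective(objective=obj)          # assumed constructor

        def build_constraints(self):
            # no equality or inequality constraints in this sketch
            return Constraints()                     # assumed constructor

        def build_variables(self):
            return Variables(x_min=self._x_min,
                             x_max=self._x_max)      # assumed constructor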

For linear programming, the MIP problem is a straightforward abstraction of a problem in the format:

    min  c'  @ x
    s.t. A   @ x <= b
         Aeq @ x == beq
         x_min  <= x <=  x_max
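
For example, the small LP min -x1 - x2 subject to x1 + x2 <= 4 and 0 <= x <= 3 is described by the data below; how these arrays are passed to the MIP problem class should be checked in the problems package:

    import numpy as np

    c = np.array([[-1.0], [-1.0]])    # objective coefficients
    A = np.array([[1.0, 1.0]])        # linear inequality coefficients
    b = np.array([[4.0]])             # linear inequality bounds
    Aeq, beq = None, None             # no equality constraints
    x_bounds = np.array([[0.0, 3.0],  # [x_min, x_max] for each variable
                         [0.0, 3.0]])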

Algorithms

This package contains the implementations of the optimization algorithms. It is organized in sub-packages according to algorithm families. Each sub-package contains one abstract implementation of its algorithm family, named BaseAlgorithmFamily().

  1. BaseAlgorithms(): base class for every BaseAlgorithmFamily()

    • Attributes:

      • eps: algorithms' numerical precision
    • Methods:

      • optimize(): abstract method for the implementation core

This framework currently supports the following optimization algorithms:

  • Cutting-plane family:

    • Ellipsoidal (Multiobjective)
  • Decomposition:

    • Nonlinear dual decomposition
  • Derivative-free:

    • Nelder-Mead simplex (constrained)
  • Lagrange (constrained):

    • Augmented Lagrangian method (using quasi-Newton)
  • Linear programming:

    • SciPy simplex and interior-point
    • GLOP for LP
    • CBC for MIP
  • Unidimensional search:

    • Enhanced golden section
    • Multimodal golden section
  • Search direction family (unconstrained):

    • Gradient method
    • Modified Newton method
    • Quasi-Newton method

Solvers

This package contains the interface to the optimization solvers implemented in the framework (a minimal usage sketch follows the class list).

  1. Optimizer(): optimizer for the built-in methods, i.e. the algorithms package.

    • Attributes

      • optimization_problem: OptimizationProblem() instance

      • algorithm: a concrete algorithm implementation instance (e.g. GradientAlgorithm())

    • Methods

      • solve(): calls algorithm.optimize() on the optimization problem
  2. OptimizationResults(): container for optimization results

    • Attributes

      • x: the solution of the optimization

      • fx: function value at solution point

      • success: whether or not the solver exited successfully

      • message: description of the cause of termination

      • n_iterations: number of iterations

      • n_function_evaluations: number of function evaluations

    • Methods

      • info(): displays optimization results info.
  3. pareto_samplers package: used to find the Pareto frontier of a multiobjective problem. The implemented sampling strategies are:

    • Lambda sampler
    • Epsilon sampler
    • Nondominated sampler
    • Mu sampler
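
Putting the pieces together, a minimal solve could look like the sketch below. The module paths, the Optimizer keyword arguments and the GradientAlgorithm constructor are assumptions, and problem stands for an already-built OptimizationProblem:

    import numpy as np
    from science_optimization.solvers import Optimizer                               # assumed path
    from science_optimization.algorithms.search_direction import GradientAlgorithm   # assumed path

    # 'problem' is an OptimizationProblem built as shown in the Problems section
    optimizer = Optimizer(optimization_problem=problem,
                          algorithm=GradientAlgorithm(x0=np.array([[1.0], [1.0]])))  # assumed arguments

    results = optimizer.solve()  # returns an OptimizationResults instance
    results.info()               # displays x, fx, iterations and termination message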

Examples

This package contains modules implementing examples of building and solving an optimization problem. It can also be used for profiling via the @profiling decorator.

Profiling

Implementation of the @profiling decorator.
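
A hypothetical usage of the decorator (the import path is an assumption):

    from science_optimization.profiling import profiling  # assumed path

    @profiling
    def solve_my_problem():
        # build and solve an optimization problem here;
        # the decorator profiles this call
        ...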

