

Project description


OMLT: Optimization and Machine Learning Toolkit

OMLT is a Python package for representing machine learning models (neural networks and gradient-boosted trees) within the Pyomo optimization environment. The package provides various optimization formulations for machine learning models (such as full-space, reduced-space, and MILP) as well as an interface to import sequential Keras and general ONNX models.
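For the ONNX side of that interface, a minimal sketch looks as follows. The helpers load_onnx_neural_network_with_bounds and write_onnx_model_with_bounds and the ReluBigMFormulation class are assumed here from the OMLT documentation and should be checked against the installed version; the file name is a placeholder.

import pyomo.environ as pyo
from omlt import OmltBlock
from omlt.neuralnet import ReluBigMFormulation
from omlt.io import load_onnx_neural_network_with_bounds

#load an ONNX network whose scaled input bounds were stored with the model
#(e.g. via omlt.io.write_onnx_model_with_bounds); the path is hypothetical
net = load_onnx_neural_network_with_bounds("my_relu_network.onnx")

#encode the ReLU network as a mixed-integer (big-M) program on an OMLT block
model = pyo.ConcreteModel()
model.nn = OmltBlock()
model.nn.build_formulation(ReluBigMFormulation(net))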

Please cite the paper for this software package as:

@article{ceccon2022omlt,
     title={OMLT: Optimization \& Machine Learning Toolkit},
     author={Ceccon, F. and Jalving, J. and Haddad, J. and Thebelt, A. and Tsay, C. and Laird, C. D. and Misener, R.},
     journal={Journal of Machine Learning Research},
     volume={23},
     number={349},
     pages={1--8},
     year={2022}
}

When utilizing linear model decision trees, please cite the following paper in addition:

@article{ammari2023,
     title={Linear Model Decision Trees as Surrogates in Optimization of Engineering Applications},
     author={Bashar L. Ammari and Emma S. Johnson and Georgia Stinchfield and Taehun Kim and Michael Bynum and William E. Hart and Joshua Pulsipher and Carl D. Laird},
     journal={Computers \& Chemical Engineering},
     volume={178},
     year={2023},
     issn={0098-1354},
     doi={https://doi.org/10.1016/j.compchemeng.2023.108347}
}

When utilizing graph neural networks, please cite the following paper in addition:

@article{zhang2024,
     title={Augmenting optimization-based molecular design with graph neural networks},
     author={Shiqiang Zhang and Juan S. Campos and Christian Feldmann and Frederik Sandfort and Miriam Mathea and Ruth Misener},
     journal={Computers \& Chemical Engineering},
     volume={186},
     pages={108684},
     year={2024},
     issn={0098-1354},
     doi={https://doi.org/10.1016/j.compchemeng.2024.108684}
}

Documentation

The latest OMLT documentation can be found on the project's Read the Docs page. Additionally, much of the current functionality is demonstrated in Jupyter notebooks available in the notebooks folder of the GitHub repository.

Example

import tensorflow
import pyomo.environ as pyo
from omlt import OmltBlock, OffsetScaling
from omlt.neuralnet import FullSpaceNNFormulation, NetworkDefinition
from omlt.io import load_keras_sequential

#load a Keras model
nn = tensorflow.keras.models.load_model('tests/models/keras_linear_131_sigmoid', compile=False)

#create a Pyomo model with an OMLT block
model = pyo.ConcreteModel()
model.nn = OmltBlock()

#the neural net contains one input and one output
model.input = pyo.Var()
model.output = pyo.Var()

#apply simple offset scaling for the input and output
scale_x = (1, 0.5)       #(mean,stdev) of the input
scale_y = (-0.25, 0.125) #(mean,stdev) of the output
scaler = OffsetScaling(offset_inputs=[scale_x[0]],
                       factor_inputs=[scale_x[1]],
                       offset_outputs=[scale_y[0]],
                       factor_outputs=[scale_y[1]])

#provide bounds on the input variable (e.g. from training)
scaled_input_bounds = {0:(0,5)}

#load the keras model into a network definition
net = load_keras_sequential(nn, scaler, scaled_input_bounds)

#multiple formulations of a neural network are possible
#this example uses the FullSpaceNNFormulation object (see the reduced-space sketch after this example)
formulation = FullSpaceNNFormulation(net)

#build the formulation on the OMLT block
model.nn.build_formulation(formulation)

#query inputs and outputs, as well as scaled inputs and outputs
model.nn.inputs.display()
model.nn.outputs.display()
model.nn.scaled_inputs.display()
model.nn.scaled_outputs.display()

#connect pyomo model input and output to the neural network
@model.Constraint()
def connect_input(mdl):
    return mdl.input == mdl.nn.inputs[0]

@model.Constraint()
def connect_output(mdl):
    return mdl.output == mdl.nn.outputs[0]

#solve an inverse problem to find the input that most closely matches an output value of 0.5
model.obj = pyo.Objective(expr=(model.output - 0.5)**2)
status = pyo.SolverFactory('ipopt').solve(model, tee=False)
print(pyo.value(model.input))
print(pyo.value(model.output))
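The full-space encoding used above is only one option. Below is a minimal sketch of swapping in a reduced-space formulation, which eliminates the intermediate layer variables; the class name ReducedSpaceNNFormulation is assumed from the OMLT documentation, and the NetworkDefinition net built in the example is reused.

from omlt.neuralnet import ReducedSpaceNNFormulation

#reuse the NetworkDefinition 'net' created by load_keras_sequential above;
#only the formulation object changes, the surrounding Pyomo model is unchanged
model_rs = pyo.ConcreteModel()
model_rs.nn = OmltBlock()
model_rs.nn.build_formulation(ReducedSpaceNNFormulation(net))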

Development

OMLT uses just to manage development tasks:

  • just to list available tasks

  • just check to run all checks

  • just fix to apply any auto-fixes

  • just dev to install development dependencies in your current Python environment

  • just dev-gpu to do the same as dev but with GPU support

  • just docs to build the documentation

OMLT also includes a workflow for publishing new releases. This workflow can be triggered by pushing a new tag with an updated version number:

git tag <version> # e.g. git tag v1.2.0
git push upstream --tags

Contributors

Each entry lists the contributor's GitHub handle, name, and acknowledgement.

  • jalving (Jordan Jalving): This work was funded by Sandia National Laboratories, Laboratory Directed Research and Development program.

  • fracek (Francesco Ceccon): This work was funded by an Engineering & Physical Sciences Research Council Research Fellowship [GrantNumber EP/P016871/1].

  • carldlaird (Carl D. Laird): Initial work was funded by Sandia National Laboratories, Laboratory Directed Research and Development program. Current work supported by Carnegie Mellon University.

  • tsaycal (Calvin Tsay): This work was funded by an Engineering & Physical Sciences Research Council Research Fellowship [GrantNumber EP/T001577/1], with additional support from an Imperial College Research Fellowship.

  • thebtron (Alexander Thebelt): This work was supported by BASF SE, Ludwigshafen am Rhein.

  • bammari (Bashar L. Ammari): This work was funded by Sandia National Laboratories, Laboratory Directed Research and Development program.

  • juan-campos (Juan S. Campos): This work was funded by an Engineering & Physical Sciences Research Council Research Fellowship [GrantNumber EP/W003317/1].

  • zshiqiang (Shiqiang Zhang): This work was funded by an Imperial College Hans Rausing PhD Scholarship.

Download files

Download the file for your platform.

Source Distribution

omlt-1.2.2.tar.gz (2.8 MB)

Built Distribution


omlt-1.2.2-py3-none-any.whl (57.3 kB)

File details

Details for the file omlt-1.2.2.tar.gz.

File metadata

  • Download URL: omlt-1.2.2.tar.gz
  • Upload date:
  • Size: 2.8 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for omlt-1.2.2.tar.gz:

  • SHA256: 011ea5ee42033cf8b12f9ae30106d126d13e074ec97ea313aec9578682d83a14
  • MD5: f7b789560008f430bdc2b877c05172db
  • BLAKE2b-256: 478b22cdf32fd76555a6c32337ce52d311bbab9b0b71d45e5ae4b1e558c4fbe5


File details

Details for the file omlt-1.2.2-py3-none-any.whl.

File metadata

  • Download URL: omlt-1.2.2-py3-none-any.whl
  • Upload date:
  • Size: 57.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for omlt-1.2.2-py3-none-any.whl:

  • SHA256: 673b60b69b01ccfa1391625027801c8a34037ca3f958648b53b215a23d9bd143
  • MD5: c5c5b6d890cca64e1fdfb2cbdc0d72ab
  • BLAKE2b-256: 73c3c55fab1f3cdc66eeb762bbf60f30f830179c67ceb551ba00ddc5dfa85c50

