Converts Machine Learning models to ONNX

Project description


Introduction

ONNXMLTools enables you to convert models from different machine learning toolkits into ONNX. Currently the following toolkits are supported:

  • Apple Core ML
  • scikit-learn (subset of models convertible to ONNX)
  • Keras
  • Spark ML (experimental)
  • LightGBM
  • libsvm
  • XGBoost

To convert TensorFlow models to ONNX, see tensorflow-onnx.

Install

You can install the latest release of ONNXMLTools from PyPI:

pip install onnxmltools

or install from source:

pip install git+https://github.com/onnx/onnxmltools

If you choose to install onnxmltools from its source code, you must set the environment variable ONNX_ML=1 before installing the onnx package.

Dependencies

This package relies on ONNX, NumPy, and Protobuf. If you are converting a model from scikit-learn, Core ML, Keras, LightGBM, or another supported toolkit, you will also need an environment with the corresponding package from the list below installed:

  1. scikit-learn
  2. CoreMLTools
  3. Keras (version 2.0.8 or higher) with the corresponding TensorFlow version
  4. LightGBM (scikit-learn interface)
  5. SparkML (pyspark version 2.3.3 only)
  6. XGBoost (scikit-learn interface)
  7. libsvm

Examples

If you want the converted ONNX model to be compatible with a certain ONNX version, specify the target_opset parameter when invoking the convert function. The Keras conversion example below demonstrates this. The mapping from ONNX Operator Sets (referred to as opsets) to ONNX releases is listed in the versioning documentation.

CoreML to ONNX Conversion

Here is a simple code snippet to convert a Core ML model into an ONNX model.

import onnxmltools
import coremltools

# Load a Core ML model
coreml_model = coremltools.utils.load_spec('example.mlmodel')

# Convert the Core ML model into ONNX
onnx_model = onnxmltools.convert_coreml(coreml_model, 'Example Model')

# Save as text
onnxmltools.utils.save_text(onnx_model, 'example.json')

# Save as protobuf
onnxmltools.utils.save_model(onnx_model, 'example.onnx')

Keras to ONNX Conversion

Next, we show an example of converting a Keras model into an ONNX model with target_opset=7, which corresponds to ONNX release version 1.2.

import onnxmltools
from keras.layers import Input, Dense, Add
from keras.models import Model

# N: batch size, C: sub-model input dimension, D: final model's input dimension
# (C equals D here so the sub-models can consume the final model's inputs)
N, C, D = 2, 3, 3

# Define a sub-model, it will become a part of our final model
sub_input1 = Input(shape=(C,))
sub_mapped1 = Dense(D)(sub_input1)
sub_model1 = Model(inputs=sub_input1, outputs=sub_mapped1)

# Define another sub-model, it will become a part of our final model
sub_input2 = Input(shape=(C,))
sub_mapped2 = Dense(D)(sub_input2)
sub_model2 = Model(inputs=sub_input2, outputs=sub_mapped2)

# Define a model built upon the previous two sub-models
input1 = Input(shape=(D,))
input2 = Input(shape=(D,))
mapped1_2 = sub_model1(input1)
mapped2_2 = sub_model2(input2)
sub_sum = Add()([mapped1_2, mapped2_2])
keras_model = Model(inputs=[input1, input2], outputs=sub_sum)

# Convert it! The target_opset parameter is optional.
onnx_model = onnxmltools.convert_keras(keras_model, target_opset=7) 
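
As in the Core ML example above, the converted model can then be serialized with save_model; the filename below is illustrative.

# Save the converted Keras model as protobuf (filename is illustrative)
onnxmltools.utils.save_model(onnx_model, 'keras_example.onnx')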

Spark ML to ONNX Conversion

Please refer to the Spark ML conversion documents in the onnxmltools repository.

Testing model converters

onnxmltools converts models into the ONNX format, which can then be used to compute predictions with the backend of your choice.
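
For example, here is a minimal sketch of computing predictions with onnxruntime, assuming onnxruntime is installed and 'example.onnx' is a model saved as above; the input shape and dummy data are illustrative and must match your model's declared input.

import numpy as np
import onnxruntime as rt

# Load the converted model (path is illustrative)
sess = rt.InferenceSession('example.onnx')

# Inspect the model's declared inputs to find the input name
input_name = sess.get_inputs()[0].name

# Run inference on a dummy batch (shape must match the model's input)
dummy_input = np.random.rand(1, 3).astype(np.float32)
outputs = sess.run(None, {input_name: dummy_input})
print(outputs)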

Checking the operator set version of your converted ONNX model

You can check the operator set of your converted ONNX model using Netron, a viewer for Neural Network models. Alternatively, you could identify your converted model's opset version through the following line of code.

opset_version = onnx_model.opset_import[0].version
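
A converted model may import several operator sets (for example the default ai.onnx domain plus ai.onnx.ml for traditional machine learning operators), so you may want to list them all; a minimal sketch:

# Print every operator set the converted model imports
for opset in onnx_model.opset_import:
    print(opset.domain or 'ai.onnx', opset.version)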

If the result from checking your ONNX model's opset is smaller than the target_opset number you specified in the onnxmltools.convert function, do not be alarmed. The ONNXMLTools converter works by converting each operator to the ONNX format individually and finding the corresponding opset version that it was most recently updated in. Once all of the operators are converted, the resultant ONNX model has the maximal opset version of all of its operators.

To illustrate this concretely, let's consider a model with two operators, Abs and Add. As of December 2018, Abs was most recently updated in opset 6, and Add was most recently updated in opset 7. Therefore, the converted ONNX model's opset will always be 7, even if you request target_opset=8. The converter behavior was defined this way to ensure backwards compatibility.
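
You can reproduce this reasoning with the onnx package; the sketch below looks up the opset in which each operator in a converted graph was last updated and takes the maximum. It assumes every node belongs to the default ai.onnx domain.

import onnx
from onnx import defs

# For each node in the graph, look up the opset version in which its
# operator schema was last updated, then take the maximum over all nodes.
versions = [defs.get_schema(node.op_type).since_version
            for node in onnx_model.graph.node]
print('model opset should be at least:', max(versions))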

Documentation for the ONNX Model format and more examples for converting models from different frameworks can be found in the ONNX tutorials repository.

Test all existing converters

Every converter can be checked automatically with onnxruntime or onnxruntime-gpu. This process requires cloning the onnxmltools repository. The following command runs all unit tests and generates dumps of models, inputs, expected outputs, and converted models in the folder TESTDUMP.

python tests/main.py DUMP

It requires onnxruntime, numpy for most models, pandas for transforms related to text features, and scipy for sparse features. One test also requires keras to exercise a custom operator. Depending on the converters being tested, scikit-learn and the other supported machine learning libraries must be installed as well.

Add a new converter

Once the converter is implemented, a unit test is added to confirm that it works. At the end of the unit test, the function dump_data_and_model or an equivalent function must be called to dump the expected output and the converted model. Once these files are generated, a corresponding test must be added in tests_backend to compute the prediction with the runtime.
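
As an illustration, a unit test for a hypothetical scikit-learn converter might end with something like the sketch below; the model, input data, and basename are illustrative, and the exact location and signature of dump_data_and_model should be checked against the test helpers in the repository.

import numpy as np
from sklearn.linear_model import LogisticRegression

from onnxmltools import convert_sklearn
from onnxmltools.convert.common.data_types import FloatTensorType
from onnxmltools.utils import dump_data_and_model

# Train a small model on dummy data (data and model are illustrative)
X = np.array([[0.0, 1.0], [1.0, 0.0], [2.0, 3.0], [3.0, 2.0]], dtype=np.float32)
y = np.array([0, 0, 1, 1])
skl_model = LogisticRegression().fit(X, y)

# Convert it to ONNX, declaring the expected input type and shape
onnx_model = convert_sklearn(
    skl_model, initial_types=[('input', FloatTensorType([1, 2]))])

# Dump the input data, the expected output, and the converted model so the
# backend tests in tests_backend can replay the prediction with a runtime
dump_data_and_model(X, skl_model, onnx_model, basename='SklearnLogisticRegression')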

License

MIT License

Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distribution

onnxmltools-1.4.0-py2.py3-none-any.whl (326.1 kB)

Uploaded: Python 2, Python 3

File details

Details for the file onnxmltools-1.4.0-py2.py3-none-any.whl.

File metadata

  • Download URL: onnxmltools-1.4.0-py2.py3-none-any.whl
  • Upload date:
  • Size: 326.1 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.4.2 requests/2.18.4 setuptools/39.1.0 requests-toolbelt/0.8.0 tqdm/4.28.1 CPython/3.6.5

File hashes

Hashes for onnxmltools-1.4.0-py2.py3-none-any.whl

  • SHA256: 5bc1d272ed895a667917150824743fb32c3a367f061beac83a51153e1834fa7e
  • MD5: c983872fa16c4b2a10949a5b910688dc
  • BLAKE2b-256: 6f0d83201824e7693b63eac297e294cfa8af69f70b1b56492d40d580ba44bb6a
