Converts Machine Learning models to ONNX

Introduction

ONNXMLTools enables you to convert models from different machine learning toolkits into ONNX. Currently the following toolkits are supported:

  • Apple Core ML
  • scikit-learn (subset of models convertible to ONNX)
  • Keras
  • LightGBM (through its scikit-learn interface)

(To convert TensorFlow models to ONNX, see tensorflow-onnx. To convert an ONNX model to Core ML, see onnx-coreml.)
If you want the converted model to be compatible with a specific ONNX version, specify the target_opset parameter when invoking the convert function; the Keras converter example below shows how this works.

Install

You can install the latest release of ONNXMLTools from PyPI:

pip install onnxmltools

or install from source:

pip install git+https://github.com/onnx/onnxmltools

If you choose to install onnxmltools from its source code, you must set the environment variable ONNX_ML=1 before installing the onnx package.
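
For example, on a Linux or macOS shell this could look as follows (a minimal sketch; on Windows, set the variable with set ONNX_ML=1 before running pip):

ONNX_ML=1 pip install onnx
pip install git+https://github.com/onnx/onnxmltools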

Dependencies

This package uses ONNX, NumPy, and ProtoBuf. If you are converting a model from scikit-learn, Apple Core ML, Keras, or LightGBM, you also need the corresponding package installed (an example pip command follows the list):

  1. scikit-learn
  2. CoreMLTools
  3. Keras (version 2.0.8 or higher) with corresponding Tensorflow version
  4. LightGBM (scikit-learn interface)
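
If needed, these optional dependencies can be installed with pip. A minimal sketch (install only the ones you actually use, and pin versions to match your environment):

pip install scikit-learn coremltools keras tensorflow lightgbm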

Examples

Here is a simple example of converting a Core ML model:

import onnxmltools
import coremltools

# Load a Core ML model
coreml_model = coremltools.utils.load_spec('example.mlmodel')

# Convert the Core ML model into ONNX
onnx_model = onnxmltools.convert_coreml(coreml_model, 'Example Model')

# Save as text
onnxmltools.utils.save_text(onnx_model, 'example.json')

# Save as protobuf
onnxmltools.utils.save_model(onnx_model, 'example.onnx')

Next, we show a simple usage of the Keras converter.

import onnxmltools
from keras.layers import Input, Dense, Add
from keras.models import Model

# N: batch size, C: sub-models' input dimension, D: final model's input dimension
N, C, D = 2, 3, 3  # C equals D here, so the final model's inputs can be fed to the sub-models

# Define a sub-model, it will become a part of our final model
sub_input1 = Input(shape=(C,))
sub_mapped1 = Dense(D)(sub_input1)
sub_model1 = Model(inputs=sub_input1, outputs=sub_mapped1)

# Define another sub-model, it will become a part of our final model
sub_input2 = Input(shape=(C,))
sub_mapped2 = Dense(D)(sub_input2)
sub_model2 = Model(inputs=sub_input2, outputs=sub_mapped2)

# Define a model built upon the previous two sub-models
input1 = Input(shape=(D,))
input2 = Input(shape=(D,))
mapped1_2 = sub_model1(input1)
mapped2_2 = sub_model2(input2)
sub_sum = Add()([mapped1_2, mapped2_2])
keras_model = Model(inputs=[input1, input2], outputs=sub_sum)

# Convert it!
onnx_model = onnxmltools.convert_keras(keras_model, target_opset=8) # target_opset is optional
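
To sanity-check the converted model, it can be saved and executed with onnxruntime. The snippet below is a minimal sketch, assuming onnxruntime is installed; the file name and the random inputs are arbitrary.

import numpy as np
import onnxruntime as rt

# Save the converted model, then load it into an onnxruntime session
onnxmltools.utils.save_model(onnx_model, 'keras_example.onnx')
sess = rt.InferenceSession('keras_example.onnx')

# The converter assigns the input names, so look them up instead of hard-coding them
feed = {inp.name: np.random.rand(N, D).astype(np.float32) for inp in sess.get_inputs()}
print(sess.run(None, feed)[0].shape)  # expected: (N, D)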

Testing converted models

onnxmltools converts models into the ONNX format, which can then be used to compute predictions with the backend of your choice. In addition, every converter can be checked automatically with onnxruntime or onnxruntime-gpu.

Test all existing converters

This process requires cloning the onnxmltools repository. The following command runs all unit tests and generates dumps of models, inputs, expected outputs, and converted models in the folder TESTDUMP.

python tests/main.py DUMP

It requires onnxruntime, numpy for most of the models, pandas for transforms related to text features, and scipy for sparse features. One test also requires keras to test a custom operator. In addition, sklearn or any other machine learning library whose converter is being tested must be installed.

Add a new converter

Once the converter is implemented, a unit test is added to verify that it works. At the end of the unit test, the function dump_data_and_model (or an equivalent function) must be called to dump the expected output and the converted model. Once these files are generated, a corresponding test must be added in tests_backend to compute the prediction with the runtime.
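
As an illustration, such a unit test for a scikit-learn model might end like this. This is a minimal sketch: the import path of dump_data_and_model and the exact initial_types specification are assumptions and may differ across onnxmltools versions.

import numpy as np
from sklearn.linear_model import LogisticRegression
import onnxmltools
from onnxmltools.convert.common.data_types import FloatTensorType
# Assumed location of the test helper; check the repository's test utilities
from onnxmltools.utils import dump_data_and_model

# Train a small model on toy data
X = np.random.rand(10, 4).astype(np.float32)
y = (X.sum(axis=1) > 2).astype(np.int64)
model = LogisticRegression().fit(X, y)

# Convert it, then dump the inputs, expected outputs and ONNX model
# so that the tests in tests_backend can replay the prediction with onnxruntime
onnx_model = onnxmltools.convert_sklearn(model, initial_types=[('input', FloatTensorType([1, 4]))])
dump_data_and_model(X, model, onnx_model, basename="SklearnLogisticRegression")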

License

MIT License

Acknowledgments

The package was developed by the following engineers and data scientists at Microsoft starting from winter 2017: Zeeshan Ahmed, Wei-Sheng Chin, Aidan Crook, Xavier Dupre, Costin Eseanu, Tom Finley, Lixin Gong, Scott Inglis, Pei Jiang, Ivan Matantsev, Prabhat Roy, M. Zeeshan Siddiqui, Shouheng Yi, Shauheen Zahirazami, Yiwen Zhu, Du Li, Xuan Li, Wenbing Li

