ONNX Runtime is a runtime accelerator for Machine Learning models

Project description

OpenVINO™ Execution Provider for ONNX Runtime is a product designed for ONNX Runtime developers who want to get started with OpenVINO™ in their inferencing applications. This product delivers OpenVINO™ inline optimizations which enhance inferencing performance with minimal code modifications.

OpenVINO™ Execution Provider for ONNX Runtime accelerates inference across many AI models on a variety of Intel® hardware such as:
  • Intel® CPUs

  • Intel® integrated GPUs

Installation

Requirements

  • Ubuntu 18.04 or 20.04, RHEL (CPU only), or Windows 10, 64-bit

  • Python 3.7, 3.8, or 3.9 on Linux; Python 3.9 only on Windows

This package supports:
  • Intel® CPUs

  • Intel® integrated GPUs

pip3 install onnxruntime-openvino

On Windows, please install the OpenVINO™ PyPI package separately. For Windows installation instructions, refer to OpenVINO™ Execution Provider for ONNX Runtime for Windows.

The OpenVINO™ Execution Provider for ONNX Runtime Linux wheels come with pre-built libraries of OpenVINO™ version 2022.2.0, eliminating the need to install OpenVINO™ separately. The OpenVINO™ libraries are prebuilt with the CXX11_ABI flag set to 0.
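After installing, a quick sanity check (a sketch, assuming the package installed into the current Python environment) is to confirm that "OpenVINOExecutionProvider" shows up in ONNX Runtime's available-provider list:

```python
import importlib.util

# Sanity check: confirm onnxruntime is importable and, if so, that the
# OpenVINO Execution Provider appears among the available providers.
spec = importlib.util.find_spec("onnxruntime")
if spec is not None:
    import onnxruntime as ort
    providers = ort.get_available_providers()
    # A successful onnxruntime-openvino install lists "OpenVINOExecutionProvider".
    print("OpenVINOExecutionProvider" in providers)
else:
    providers = []
    print("onnxruntime is not installed in this environment")
```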

Usage

By default, the Intel® CPU is used to run inference. However, you can switch inferencing to an Intel® integrated GPU by setting the provider configuration's device type argument.
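As a minimal sketch of that device-type switch: "model.onnx" below is a placeholder path, and "GPU_FP32" is one of the provider's documented device_type values (others include "CPU_FP32" and "GPU_FP16"):

```python
# Select the OpenVINO Execution Provider and direct inference to the
# integrated GPU via the device_type provider option.
providers = [("OpenVINOExecutionProvider", {"device_type": "GPU_FP32"})]

try:
    import onnxruntime as ort
    # "model.onnx" is a placeholder; substitute your own ONNX model file.
    session = ort.InferenceSession("model.onnx", providers=providers)
except Exception:
    # onnxruntime-openvino and/or model.onnx may be absent in this
    # environment; the provider configuration above is the point of the sketch.
    session = None
```

Omitting the options dictionary (or passing {"device_type": "CPU_FP32"}) keeps inference on the CPU, the default.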

For more API calls and environment variables, see Usage.

Samples

To see what you can do with OpenVINO™ Execution Provider for ONNX Runtime, explore the demos in the Examples.

License

OpenVINO™ Execution Provider for ONNX Runtime is licensed under MIT. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.

Support

Please submit your questions, feature requests and bug reports via GitHub Issues.

How to Contribute

We welcome community contributions to OpenVINO™ Execution Provider for ONNX Runtime. If you have an idea for an improvement, please share it with us.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distributions

  • onnxruntime_openvino-1.14.0-cp39-cp39-win_amd64.whl (4.6 MB) — CPython 3.9, Windows x86-64

  • onnxruntime_openvino-1.14.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (36.6 MB) — CPython 3.9, manylinux: glibc 2.17+ x86-64

  • onnxruntime_openvino-1.14.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (36.5 MB) — CPython 3.8, manylinux: glibc 2.17+ x86-64

  • onnxruntime_openvino-1.14.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (36.5 MB) — CPython 3.7m, manylinux: glibc 2.17+ x86-64
