
ONNX Runtime is a runtime accelerator for Machine Learning models

Project description

OpenVINO™ Execution Provider for ONNX Runtime is a product designed for ONNX Runtime developers who want to get started with OpenVINO™ in their inferencing applications. This product delivers OpenVINO™ inline optimizations which enhance inferencing performance with minimal code modifications.

OpenVINO™ Execution Provider for ONNX Runtime accelerates inference across many AI models on a variety of Intel® hardware such as:
  • Intel® CPUs

  • Intel® integrated GPUs

  • Intel® discrete GPUs

Installation

Requirements

  • Ubuntu 20.04 or Windows 10 (64-bit)

  • Python 3.9, 3.10, or 3.11 for Linux; Python 3.10 or 3.11 for Windows

  • OpenVINO™ 2023.3.0 only

This package supports:
  • Intel® CPUs

  • Intel® integrated GPUs

  • Intel® discrete GPUs

pip3 install onnxruntime-openvino

Please install OpenVINO™ PyPi Package separately for Windows. For installation instructions on Windows please refer to OpenVINO™ Execution Provider for ONNX Runtime for Windows.
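
For example, a plausible Windows install sequence is sketched below; the exact OpenVINO™ package pin is an assumption based on the 2023.3.0 requirement above, so refer to the Windows instructions for the authoritative steps:

    pip3 install openvino==2023.3.0
    pip3 install onnxruntime-openvino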

The OpenVINO™ Execution Provider for ONNX Runtime Linux wheels come with pre-built OpenVINO™ 2023.3.0 libraries, eliminating the need to install OpenVINO™ separately. The OpenVINO™ libraries are prebuilt with the CXX11_ABI flag set to 0.
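
To verify the installation, you can check that the OpenVINO™ Execution Provider was registered with ONNX Runtime; this minimal sketch uses the standard get_available_providers API:

    import onnxruntime as ort

    # "OpenVINOExecutionProvider" should appear in this list after a
    # successful install.
    print(ort.get_available_providers())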

For more details on build and installation please refer to Build.

Usage

By default, Intel® CPU is used to run inference. However, you can change the default to either an Intel® integrated or discrete GPU by setting the device_type option in the provider configuration, as sketched below.
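
A minimal sketch, assuming a local model.onnx and the standard onnxruntime InferenceSession API; the device_type value "GPU" is illustrative, and the accepted values vary by hardware and release:

    import onnxruntime as ort

    # Request the OpenVINO Execution Provider and direct inference to an
    # Intel GPU instead of the default CPU via the device_type option.
    session = ort.InferenceSession(
        "model.onnx",  # hypothetical model path
        providers=["OpenVINOExecutionProvider"],
        provider_options=[{"device_type": "GPU"}],
    )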

For more API calls and environment variables, see Usage.

Samples

To see what you can do with OpenVINO™ Execution Provider for ONNX Runtime, explore the demos in the Examples.

License

OpenVINO™ Execution Provider for ONNX Runtime is licensed under MIT. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.

Support

Please submit your questions, feature requests and bug reports via GitHub Issues.

How to Contribute

We welcome community contributions to OpenVINO™ Execution Provider for ONNX Runtime. If you have an idea for an improvement, please share your proposal via GitHub Issues.

Download files


Source Distributions

No source distribution files are available for this release.

Built Distributions

  • onnxruntime_openvino-1.17.1-cp311-cp311-win_amd64.whl (6.7 MB): CPython 3.11, Windows x86-64

  • onnxruntime_openvino-1.17.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (53.2 MB): CPython 3.11, manylinux (glibc 2.17+) x86-64

  • onnxruntime_openvino-1.17.1-cp310-cp310-win_amd64.whl (6.7 MB): CPython 3.10, Windows x86-64

  • onnxruntime_openvino-1.17.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (53.2 MB): CPython 3.10, manylinux (glibc 2.17+) x86-64

  • onnxruntime_openvino-1.17.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (53.2 MB): CPython 3.9, manylinux (glibc 2.17+) x86-64
