
ONNX Runtime is a runtime accelerator for Machine Learning models

Project description

OpenVINO™ Execution Provider for ONNX Runtime is a product designed for ONNX Runtime developers who want to get started with OpenVINO™ in their inferencing applications. This product delivers OpenVINO™ inline optimizations which enhance inferencing performance with minimal code modifications.

OpenVINO™ Execution Provider for ONNX Runtime accelerates inference across many AI models on a variety of Intel® hardware such as:
  • Intel® CPUs

  • Intel® integrated GPUs

  • Intel® discrete GPUs

  • Intel® integrated NPUs (Windows only)

Installation

Requirements

  • Ubuntu 18.04 or 20.04, RHEL (CPU only), or Windows 10, 64-bit

  • Python 3.9, 3.10, or 3.11 for Linux; Python 3.10 or 3.11 for Windows

This package supports:
  • Intel® CPUs

  • Intel® integrated GPUs

  • Intel® discrete GPUs

  • Intel® integrated NPUs (Windows only)

pip3 install onnxruntime-openvino

Please install the OpenVINO™ PyPI package separately on Windows. For installation instructions on Windows, please refer to OpenVINO™ Execution Provider for ONNX Runtime for Windows.

The OpenVINO™ Execution Provider for ONNX Runtime Linux wheels come with pre-built OpenVINO™ 2024.1.0 libraries, eliminating the need to install OpenVINO™ separately.
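After installation, you can confirm that the execution provider is visible to ONNX Runtime. A minimal check, assuming onnxruntime-openvino was installed into the active Python environment:

    import onnxruntime as ort

    # "OpenVINOExecutionProvider" should appear in this list if the wheel installed correctly.
    print(ort.get_available_providers())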

For more details on build and installation, please refer to Build.

Usage

By default, inference runs on the Intel® CPU. However, you can change the default to an Intel® integrated GPU, discrete GPU, or integrated NPU (Windows only) by setting the device type argument in the provider configuration, as shown in the sketch below.
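As a minimal sketch of selecting the device through the provider options ("model.onnx" is a placeholder path, and the device_type value follows the provider's documented options such as "CPU", "GPU", or "NPU"):

    import onnxruntime as ort

    # Placeholder model path; replace with your own ONNX model.
    session = ort.InferenceSession(
        "model.onnx",
        providers=["OpenVINOExecutionProvider"],
        provider_options=[{"device_type": "GPU"}],  # e.g. "CPU", "GPU", or "NPU" (Windows only)
    )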

For more API calls and environment variables, see Usage.

Samples

To see what you can do with OpenVINO™ Execution Provider for ONNX Runtime, explore the demos in the Examples.

License

OpenVINO™ Execution Provider for ONNX Runtime is licensed under MIT. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.

Support

Please submit your questions, feature requests and bug reports via GitHub Issues.

How to Contribute

We welcome community contributions to OpenVINO™ Execution Provider for ONNX Runtime. If you have an idea for an improvement, please share it with us via GitHub Issues or a pull request.

Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distributions

  • onnxruntime_openvino-1.19.0-cp311-cp311-win_amd64.whl (12.1 MB): CPython 3.11, Windows x86-64

  • onnxruntime_openvino-1.19.0-cp311-cp311-manylinux_2_28_x86_64.whl (52.1 MB): CPython 3.11, manylinux (glibc 2.28+) x86-64

  • onnxruntime_openvino-1.19.0-cp310-cp310-win_amd64.whl (12.1 MB): CPython 3.10, Windows x86-64

  • onnxruntime_openvino-1.19.0-cp310-cp310-manylinux_2_28_x86_64.whl (52.1 MB): CPython 3.10, manylinux (glibc 2.28+) x86-64

  • onnxruntime_openvino-1.19.0-cp39-cp39-manylinux_2_28_x86_64.whl (52.1 MB): CPython 3.9, manylinux (glibc 2.28+) x86-64

File details

Hashes for each built distribution are listed below.

onnxruntime_openvino-1.19.0-cp311-cp311-win_amd64.whl
  SHA256: 12330922ecdb694ea28dbdcf08c172e47a5a84fee603040691341336ee3e42bc
  MD5: ff29f0fcc07d3d66d6aba121698210f1
  BLAKE2b-256: e8bbb8fa939119164eecef8ac38a08b6f7c7832158e3bec056f32866e5c68c37

onnxruntime_openvino-1.19.0-cp311-cp311-manylinux_2_28_x86_64.whl
  SHA256: f3a0b954026286421b3a769c746c403e8f141f3887d1dd601beb7c4dbf77488a
  MD5: 0f95adbe566f7930b5a7a1b72162feb1
  BLAKE2b-256: 02251c1d982ef2c07f59b972ff0764fb7ac6fc89ae3d9b2936697d1b762b7db1

onnxruntime_openvino-1.19.0-cp310-cp310-win_amd64.whl
  SHA256: fb8de2a60cf78db6e201b0a489479995d166938e9c53b01ff342dc7f5f8251ff
  MD5: d49bd61a06d951eff5d87a44fd79be27
  BLAKE2b-256: 62591bedc810bdd0ba887d0483751cb407ce07da2050bbe56c8d7b4bda433794

onnxruntime_openvino-1.19.0-cp310-cp310-manylinux_2_28_x86_64.whl
  SHA256: 8c5658da819b26d9f35f95204e1bdfb74a100a7533e74edab3af6316c1e316e8
  MD5: 761e0704210b615bbde7132ed668380f
  BLAKE2b-256: a2ec414da1603574065cae9668c47ef5453966a1d3c4237ef101e36e89b50355

onnxruntime_openvino-1.19.0-cp39-cp39-manylinux_2_28_x86_64.whl
  SHA256: be00502b1a46ba1891cbe49049033745f71c0b99df6d24b979f5b4084b9567d0
  MD5: 01926fa6540def37ec0861e8154e8bb1
  BLAKE2b-256: 20373f66154285d67325737b0f0d22535752bdd98091614e9592455b73d00aa3
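The hashes above can be used to verify a downloaded wheel before installing it. A minimal sketch, assuming the cp311 Windows wheel has been downloaded to the current directory:

    import hashlib

    # Hypothetical local path to the downloaded wheel; adjust as needed.
    wheel_path = "onnxruntime_openvino-1.19.0-cp311-cp311-win_amd64.whl"

    with open(wheel_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    # Compare against the SHA256 value listed above for this wheel.
    expected = "12330922ecdb694ea28dbdcf08c172e47a5a84fee603040691341336ee3e42bc"
    print("OK" if digest == expected else "MISMATCH")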
