
ONNX Runtime is a runtime accelerator for Machine Learning models

Project description

OpenVINO™ Execution Provider for ONNX Runtime is a product designed for ONNX Runtime developers who want to get started with OpenVINO™ in their inferencing applications. This product delivers OpenVINO™ inline optimizations which enhance inferencing performance with minimal code modifications.

OpenVINO™ Execution Provider for ONNX Runtime accelerates inference across many AI models on a variety of Intel® hardware such as:
  • Intel® CPUs

  • Intel® integrated GPUs

  • Intel® discrete GPUs

  • Intel® integrated NPUs (Windows only)

Installation

Requirements

  • Ubuntu 18.04 or 20.04, RHEL (CPU only), or Windows 10, 64-bit

  • Python 3.9, 3.10, or 3.11 on Linux; Python 3.10 or 3.11 on Windows

This package supports:
  • Intel® CPUs

  • Intel® integrated GPUs

  • Intel® discrete GPUs

  • Intel® integrated NPUs (Windows only)

pip3 install onnxruntime-openvino

On Windows, install the OpenVINO™ PyPI package separately. For Windows installation instructions, refer to OpenVINO™ Execution Provider for ONNX Runtime for Windows.

The OpenVINO™ Execution Provider for ONNX Runtime Linux wheels come with pre-built OpenVINO™ 2024.1.0 libraries, eliminating the need to install OpenVINO™ separately.
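After installation, you can check whether the OpenVINO™ Execution Provider is registered with ONNX Runtime. The helper below is a minimal sketch: it returns False rather than raising if onnxruntime is not installed.

```python
def openvino_ep_available() -> bool:
    """Return True if ONNX Runtime reports the OpenVINO Execution Provider."""
    try:
        import onnxruntime as ort
    except ImportError:
        # onnxruntime (or onnxruntime-openvino) is not installed
        return False
    return "OpenVINOExecutionProvider" in ort.get_available_providers()

print(openvino_ep_available())
```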

For more details on build and installation please refer to Build.

Usage

By default, inference runs on the Intel® CPU. To run on an Intel® integrated GPU, discrete GPU, or integrated NPU (Windows only) instead, set the device type in the provider configuration when creating the inference session.
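A hedged sketch of that provider configuration: the model path "model.onnx" is a placeholder, and the device_type values ("CPU", "GPU", "NPU") follow the provider's documented options.

```python
def openvino_provider_config(device_type: str = "CPU"):
    """Build the providers / provider_options pair for an InferenceSession."""
    return (
        ["OpenVINOExecutionProvider"],
        [{"device_type": device_type}],
    )

# Usage (requires onnxruntime-openvino and an ONNX model on disk):
# import onnxruntime as ort
# providers, options = openvino_provider_config("GPU")
# session = ort.InferenceSession("model.onnx",
#                                providers=providers,
#                                provider_options=options)
```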

For more API calls and environment variables, see Usage.

Samples

To see what you can do with OpenVINO™ Execution Provider for ONNX Runtime, explore the demos in the Examples.

License

OpenVINO™ Execution Provider for ONNX Runtime is licensed under MIT. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.

Support

Please submit your questions, feature requests and bug reports via GitHub Issues.

How to Contribute

We welcome community contributions to OpenVINO™ Execution Provider for ONNX Runtime. If you have an idea for improvement, please share it via GitHub Issues or submit a pull request.

Download files

Download the file for your platform. No source distribution is available for this release; the following built distributions are provided.

onnxruntime_openvino-1.20.0-cp312-cp312-win_amd64.whl (12.3 MB)
  • CPython 3.12, Windows x86-64
  • SHA256: dbd39d1dedf798997393f8fdf8cb89ee4ed905c9a8ea000abdce7c288181b829

onnxruntime_openvino-1.20.0-cp312-cp312-manylinux_2_28_x86_64.whl (61.7 MB)
  • CPython 3.12, manylinux (glibc 2.28+), x86-64
  • SHA256: 97f424b05feb18b4dbb6e9a85d2bfbd4c928508dc8846622b1c12b4086ce937c

onnxruntime_openvino-1.20.0-cp311-cp311-win_amd64.whl (12.3 MB)
  • CPython 3.11, Windows x86-64
  • SHA256: d5b3b547e887cbc4081dad940db7d9aef6103dcce30a6746f2042400ad70676f

onnxruntime_openvino-1.20.0-cp311-cp311-manylinux_2_28_x86_64.whl (61.7 MB)
  • CPython 3.11, manylinux (glibc 2.28+), x86-64
  • SHA256: a5e28a369394b895a0f7048d6ad940f1510a445aa3c89ad4039b57c1a006f68f

onnxruntime_openvino-1.20.0-cp310-cp310-manylinux_2_28_x86_64.whl (61.7 MB)
  • CPython 3.10, manylinux (glibc 2.28+), x86-64
  • SHA256: ae9089466ad3930cced192e8604de161c17fe833b962e511c7133a3b148e6c87
