Open Neural Network Exchange

Project description


Open Neural Network Exchange (ONNX) is an open ecosystem that empowers AI developers to choose the right tools as their project evolves. ONNX provides an open source format for AI models, both deep learning and traditional ML. It defines an extensible computation graph model, as well as definitions of built-in operators and standard data types. Currently we focus on the capabilities needed for inferencing (scoring).
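
As a concrete illustration of that graph model, the snippet below builds a one-node model in memory with the onnx.helper utilities and validates it with the checker. This is a minimal sketch; the operator, names, and shapes are chosen arbitrarily for illustration.

from onnx import TensorProto, checker, helper
# Typed input/output value infos using standard ONNX data types.
X = helper.make_tensor_value_info("X", TensorProto.FLOAT, [1, 2])
Y = helper.make_tensor_value_info("Y", TensorProto.FLOAT, [1, 2])
# One built-in operator node: Y = Relu(X).
node = helper.make_node("Relu", inputs=["X"], outputs=["Y"])
# Assemble the computation graph, wrap it in a model, and validate it.
graph = helper.make_graph([node], "tiny_graph", inputs=[X], outputs=[Y])
model = helper.make_model(graph)
checker.check_model(model)
print(model.graph.node[0].op_type)  # Relu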

ONNX is widely supported and can be found in many frameworks, tools, and hardware. Enabling interoperability between different frameworks and streamlining the path from research to production helps increase the speed of innovation in the AI community. We invite the community to join us and further evolve ONNX.

Use ONNX

Learn about the ONNX spec

Programming utilities for working with ONNX Graphs
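
For instance, a typical use of the graph utilities is to load a serialized model, run shape inference over it, and save the annotated result. The sketch below assumes a model file of your own; "model.onnx" and "model_inferred.onnx" are placeholder paths.

import onnx
from onnx import shape_inference
# "model.onnx" is a placeholder path for your own serialized model.
model = onnx.load("model.onnx")
# Annotate the graph with inferred shapes for intermediate values.
inferred = shape_inference.infer_shapes(model)
# Inspect the operators and save the annotated model.
print([node.op_type for node in inferred.graph.node])
onnx.save(inferred, "model_inferred.onnx")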

Contribute

ONNX is a community project and the open governance model is described here. We encourage you to join the effort and contribute feedback, ideas, and code. You can participate in the Special Interest Groups and Working Groups to shape the future of ONNX.

Check out our contribution guide to get started.

If you think some operator should be added to the ONNX specification, please read this document.

Community meetings

The schedules of the regular meetings of the Steering Committee, the working groups, and the SIGs can be found here.

Community Meetups are held at least once a year. Content from previous community meetups is available here.

Discuss

We encourage you to open Issues or use Slack (if you have not joined yet, please use this link to join the group) for more real-time discussion.

Follow Us

Stay up to date with the latest ONNX news. [Facebook] [Twitter]

Roadmap

A roadmap process takes place every year. More details can be found here.

Installation

Official Python packages

ONNX release packages are published on PyPI.

pip install onnx  # or pip install onnx[reference] for optional reference implementation dependencies
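
With the package installed, recent ONNX releases also ship a pure-Python reference evaluator that can execute a model directly (the [reference] extra installs optional dependencies used by some operators). The sketch below builds a tiny in-memory model and runs it; the names and shapes are chosen only for illustration.

import numpy as np
from onnx import TensorProto, helper
from onnx.reference import ReferenceEvaluator
# A tiny in-memory model: Y = Abs(X).
X = helper.make_tensor_value_info("X", TensorProto.FLOAT, [None, 3])
Y = helper.make_tensor_value_info("Y", TensorProto.FLOAT, [None, 3])
graph = helper.make_graph([helper.make_node("Abs", ["X"], ["Y"])], "demo", [X], [Y])
model = helper.make_model(graph)
# Run the pure-Python reference implementation on a NumPy input.
sess = ReferenceEvaluator(model)
(result,) = sess.run(None, {"X": np.array([[-1.0, 2.0, -3.0]], dtype=np.float32)})
print(result)  # [[1. 2. 3.]]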

AMD's ONNX weekly packages are published on PyPI to enable experimentation and early testing.

vcpkg packages

onnx is in the maintenance list of vcpkg, so you can easily use vcpkg to build and install it.

git clone https://github.com/microsoft/vcpkg.git
cd vcpkg
./bootstrap-vcpkg.bat # For powershell
./bootstrap-vcpkg.sh # For bash
./vcpkg install onnx

Conda packages

A binary build of ONNX is available from Conda, in conda-forge:

conda install -c conda-forge onnx

Build ONNX from Source

Before building from source, uninstall any existing versions of ONNX with pip uninstall onnx.

A C++17 (or newer) compiler is required to build ONNX from source. You can still specify your own CMAKE_CXX_STANDARD when building ONNX.

If you don't have protobuf installed, ONNX will internally download and build protobuf as part of the ONNX build.

Alternatively, you can manually install the protobuf C/C++ libraries and tools at a specific version before proceeding. Then, depending on how you installed protobuf, set the environment variable CMAKE_ARGS to "-DONNX_USE_PROTOBUF_SHARED_LIBS=ON" or "-DONNX_USE_PROTOBUF_SHARED_LIBS=OFF". For example, you may need to run the following command:

Linux:

export CMAKE_ARGS="-DONNX_USE_PROTOBUF_SHARED_LIBS=ON"

Windows:

set CMAKE_ARGS="-DONNX_USE_PROTOBUF_SHARED_LIBS=ON"

Choose ON or OFF according to the kind of protobuf library you have: shared libraries are files ending in *.dll/*.so/*.dylib, while static libraries are files ending in *.a/*.lib. The right setting depends on how you obtained your protobuf library and how it was built; the default is OFF. You don't need to run the commands above if you'd prefer to use a static protobuf library.

Windows

If you are building ONNX from source, it is recommended that you also build Protobuf locally as a static library. The version distributed with conda-forge is a DLL, but ONNX expects it to be a static library. Building protobuf locally also lets you control the version of protobuf. The tested and recommended version is 3.21.12.

The instructions in this README assume you are using Visual Studio. It is recommended that you run all the commands from a shell started from "x64 Native Tools Command Prompt for VS 2019" and keep the build system generator for cmake (e.g., cmake -G "Visual Studio 16 2019") consistent while building protobuf as well as ONNX.

You can get protobuf by running the following commands:

git clone https://github.com/protocolbuffers/protobuf.git
cd protobuf
git checkout v21.12
cd cmake
cmake -G "Visual Studio 16 2019" -A x64 -DCMAKE_INSTALL_PREFIX=<protobuf_install_dir> -Dprotobuf_MSVC_STATIC_RUNTIME=OFF -Dprotobuf_BUILD_SHARED_LIBS=OFF -Dprotobuf_BUILD_TESTS=OFF -Dprotobuf_BUILD_EXAMPLES=OFF .
msbuild protobuf.sln /m /p:Configuration=Release
msbuild INSTALL.vcxproj /p:Configuration=Release

Protobuf will then be built as a static library and installed to <protobuf_install_dir>. Please add the bin directory (which contains protoc.exe) to your PATH, and point CMake at the install location:

set CMAKE_PREFIX_PATH=<protobuf_install_dir>;%CMAKE_PREFIX_PATH%

Please note: if your protobuf_install_dir contains spaces, do not add quotation marks around it.

Alternative: if you don't want to change your PATH, you can set ONNX_PROTOC_EXECUTABLE instead.

set CMAKE_ARGS=-DONNX_PROTOC_EXECUTABLE=<full_path_to_protoc.exe>

Then you can build ONNX as:

git clone https://github.com/onnx/onnx.git
cd onnx
git submodule update --init --recursive
# Optional: prefer lite proto
set CMAKE_ARGS=-DONNX_USE_LITE_PROTO=ON
pip install -e . -v

Linux

First, you need to install protobuf. The minimum Protobuf compiler (protoc) version required by ONNX is 3.6.1. Please note that old protoc versions might not work with CMAKE_ARGS=-DONNX_USE_LITE_PROTO=ON.

Ubuntu 20.04 (and newer) users may choose to install protobuf via

apt-get install python3-pip python3-dev libprotobuf-dev protobuf-compiler

In this case, it is required to add -DONNX_USE_PROTOBUF_SHARED_LIBS=ON to CMAKE_ARGS in the ONNX build step.

A more general way is to build and install it from source. See the instructions below for more details.

Installing Protobuf from source

Debian/Ubuntu:

  git clone https://github.com/protocolbuffers/protobuf.git
  cd protobuf
  git checkout v21.12
  git submodule update --init --recursive
  mkdir build_source && cd build_source
  cmake ../cmake -Dprotobuf_BUILD_SHARED_LIBS=OFF -DCMAKE_INSTALL_PREFIX=/usr -DCMAKE_INSTALL_SYSCONFDIR=/etc -DCMAKE_POSITION_INDEPENDENT_CODE=ON -Dprotobuf_BUILD_TESTS=OFF -DCMAKE_BUILD_TYPE=Release
  make -j$(nproc)
  make install

CentOS/RHEL/Fedora:

  git clone https://github.com/protocolbuffers/protobuf.git
  cd protobuf
  git checkout v21.12
  git submodule update --init --recursive
  mkdir build_source && cd build_source
  cmake ../cmake  -DCMAKE_INSTALL_LIBDIR=lib64 -Dprotobuf_BUILD_SHARED_LIBS=OFF -DCMAKE_INSTALL_PREFIX=/usr -DCMAKE_INSTALL_SYSCONFDIR=/etc -DCMAKE_POSITION_INDEPENDENT_CODE=ON -Dprotobuf_BUILD_TESTS=OFF -DCMAKE_BUILD_TYPE=Release
  make -j$(nproc)
  make install

Here "-DCMAKE_POSITION_INDEPENDENT_CODE=ON" is crucial. By default static libraries are built without "-fPIC" flag, they are not position independent code. But shared libraries must be position independent code. Python C/C++ extensions(like ONNX) are shared libraries. So if a static library was not built with "-fPIC", it can't be linked to such a shared library.

Once the build succeeds, update your PATH to include the protobuf paths.

Then you can build ONNX as:

git clone https://github.com/onnx/onnx.git
cd onnx
git submodule update --init --recursive
# Optional: prefer lite proto
export CMAKE_ARGS=-DONNX_USE_LITE_PROTO=ON
pip install -e . -v

Mac

export NUM_CORES=`sysctl -n hw.ncpu`
brew update
brew install autoconf && brew install automake
wget https://github.com/protocolbuffers/protobuf/releases/download/v21.12/protobuf-cpp-3.21.12.tar.gz
tar -xvf protobuf-cpp-3.21.12.tar.gz
cd protobuf-3.21.12
mkdir build_source && cd build_source
cmake ../cmake -Dprotobuf_BUILD_SHARED_LIBS=OFF -DCMAKE_POSITION_INDEPENDENT_CODE=ON -Dprotobuf_BUILD_TESTS=OFF -DCMAKE_BUILD_TYPE=Release
make -j${NUM_CORES}
make install

Once the build succeeds, update your PATH to include the protobuf paths.

Then you can build ONNX as:

git clone --recursive https://github.com/onnx/onnx.git
cd onnx
# Optional: prefer lite proto
export CMAKE_ARGS=-DONNX_USE_LITE_PROTO=ON
pip install -e . -v

Verify Installation

After installation, run

python -c "import onnx"

to verify it works.
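
For a slightly more informative check, you can also print the installed version and the default operator-set version it supports (a minimal sketch):

import onnx
from onnx import defs
# Report the installed ONNX version and its default opset.
print("onnx version:", onnx.__version__)
print("default opset:", defs.onnx_opset_version())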

Common Build Options

For the full list, refer to CMakeLists.txt.

Environment variables

  • USE_MSVC_STATIC_RUNTIME should be 1 or 0, not ON or OFF. When set to 1, onnx links statically to the runtime library. Default: USE_MSVC_STATIC_RUNTIME=0

  • DEBUG should be 0 or 1. When set to 1, onnx is built in debug mode. For debug versions of the dependencies, you need to open the CMakeLists file and append a letter d at the end of the package name lines. For example, NAMES protobuf-lite would become NAMES protobuf-lited. Default: DEBUG=0

CMake variables

  • ONNX_USE_PROTOBUF_SHARED_LIBS should be ON or OFF. It determines how onnx links to the protobuf libraries. Default: ONNX_USE_PROTOBUF_SHARED_LIBS=OFF, USE_MSVC_STATIC_RUNTIME=0.

    • When set to ON, onnx will dynamically link to the protobuf shared libs; PROTOBUF_USE_DLLS will be defined as described here, Protobuf_USE_STATIC_LIBS will be set to OFF, and USE_MSVC_STATIC_RUNTIME must be 0.
    • When set to OFF, onnx will link statically to protobuf; Protobuf_USE_STATIC_LIBS will be set to ON (to force the use of the static libraries) and USE_MSVC_STATIC_RUNTIME can be 0 or 1.
  • ONNX_USE_LITE_PROTO should be ON or OFF. When set to ON, onnx uses lite protobuf instead of full protobuf. Default: ONNX_USE_LITE_PROTO=OFF

  • ONNX_WERROR should be ON or OFF. When set to ON, warnings are treated as errors. Default: ONNX_WERROR=OFF in local builds, ON in CI and release pipelines.

Common Errors

  • Note: the import onnx command does not work from the source checkout directory; in this case you'll see ModuleNotFoundError: No module named 'onnx.onnx_cpp2py_export'. Change into another directory to fix this error.

  • If you run into any issues while building Protobuf as a static library, please ensure that shared Protobuf libraries, like libprotobuf, are not installed on your device or in the conda environment. If these shared libraries exist, either remove them to build Protobuf from source as a static library, or skip the Protobuf build from source to use the shared version directly.

  • If you run into any issues while building ONNX from source, and your error message reads, Could not find pythonXX.lib, ensure that you have consistent Python versions for common commands, such as python and pip. Clean all existing build files and rebuild ONNX again.

Testing

ONNX uses pytest as its test driver. To run the tests, you first need to install pytest:

pip install pytest nbval

After installing pytest, use the following command to run tests.

pytest
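
If you want a quick sanity test of your own alongside the suite, a pytest test can be as small as the hypothetical test_onnx_smoke.py below. The file name and test are illustrative only and are not part of the ONNX test suite.

# test_onnx_smoke.py -- illustrative example, not part of the ONNX test suite
from onnx import TensorProto, checker, helper

def test_tiny_model_passes_checker():
    # Build a one-node Identity model and assert the checker accepts it.
    x = helper.make_tensor_value_info("x", TensorProto.FLOAT, [1])
    y = helper.make_tensor_value_info("y", TensorProto.FLOAT, [1])
    graph = helper.make_graph([helper.make_node("Identity", ["x"], ["y"])], "smoke", [x], [y])
    model = helper.make_model(graph)
    checker.check_model(model)  # raises ValidationError if the model is invalid
    assert model.graph.node[0].op_type == "Identity"

Run it with pytest test_onnx_smoke.py.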

Development

Check out the contributor guide for instructions.

License

Apache License v2.0

Code of Conduct

ONNX Open Source Code of Conduct
