Unique tool to convert ONNX files (NCHW) to TensorFlow format (NHWC). The purpose of this tool is to solve the massive Transpose extrapolation problem in onnx-tensorflow (onnx-tf).

Project description

[WIP] onnx2tf

Self-created tool to convert ONNX files (NCHW) to TensorFlow format (NHWC). The purpose of this tool is to solve the massive Transpose extrapolation problem in onnx-tensorflow (onnx-tf).


Key concept

  • onnx-tensorflow is a very useful tool, but the performance of the generated TensorFlow models is significantly degraded because a large number of Transpose OPs are extrapolated before and after each OP during the NCHW-to-NHWC format conversion. Therefore, I am building this tool myself as a derivative of onnx-tensorflow that does not extrapolate Transpose OPs.
  • It handles not only 4-dimensional inputs such as NCHW to NHWC, but also inputs with 3, 5, or more dimensions, for example NCDHW to NDHWC. However, because 1-D, 2-D, 3-D and 6-D inputs may produce patterns that are mechanically difficult to convert, it should be possible to pass parameters that externally modify the tool's behavior.
  • A Reshape OP that compresses or decompresses dimensions, for example patterns such as [1,200,200,5] -> [1,200,-1] or [10,20,30,40,50] -> [10,2,10,30,10,4,50], will disrupt the model conversion and cause errors with about 95% probability, because a Reshape follows the flattened element order and that order changes when the layout is transposed (see the sketch after this list).
  • Supports conversion to TensorFlow saved_model and to TFLite (Float32/Float16).
  • Quantization to INT8 is not supported. For quantization, use the official TensorFlow converter on the generated saved_model yourself (see the TFLite sketch after this list).
  • Files exceeding the Protocol Buffers file size limit of 2GB are not supported, so the ONNX external data format is not supported at this initial stage of the tool's development.
  • If the model contains ONNX OPs that are not supported by TensorFlow, use simple-onnx-processing-tools to replace them with harmless OPs in advance, and then convert the model with this tool. In other words, with some effort you can convert any model.
  • BatchNormalization supports only inference mode.
  • Only for opset=11 or higher
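
To illustrate why Reshape OPs interact badly with the NCHW-to-NHWC layout change, here is a minimal NumPy sketch (illustrative only, not onnx2tf code; the tiny 1x2x2x3 tensor is chosen purely for readability):

```python
import numpy as np

# Tiny tensor in NCHW layout: batch=1, channels=2, height=2, width=3
x_nchw = np.arange(12).reshape(1, 2, 2, 3)

# The NCHW -> NHWC layout change is just an axis permutation (a Transpose)
x_nhwc = x_nchw.transpose(0, 2, 3, 1)   # shape (1, 2, 3, 2)

# A Reshape authored against the NCHW layout, e.g. [1,2,2,3] -> [1,2,-1]:
y_nchw = x_nchw.reshape(1, 2, -1)       # each row is one whole channel

# Naively applying the same Reshape after the layout change scrambles the data,
# because Reshape follows the flattened element order, which the Transpose changed.
y_nhwc = x_nhwc.reshape(1, 2, -1)       # rows now interleave channels and pixels

print(y_nchw[0])   # [[ 0  1  2  3  4  5] [ 6  7  8  9 10 11]]
print(y_nhwc[0])   # [[ 0  6  1  7  2  8] [ 3  9  4 10  5 11]]
```

This is why a Reshape whose target shape was computed for the NCHW layout cannot simply be copied over; the shape or the surrounding Transposes must be adjusted, which is the error-prone case described above.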
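Since INT8 quantization is delegated to the official TensorFlow Lite converter, a minimal post-training quantization sketch starting from the generated saved_model might look like the following (the saved_model path, the 1x224x224x3 input shape, and the random calibration data are placeholders you must replace with your own):

```python
import numpy as np
import tensorflow as tf

# Load the saved_model produced by this tool (placeholder path)
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model")

# Calibration samples for full-integer quantization -- replace the random data
# with real preprocessed inputs in the model's NHWC shape (assumed 1x224x224x3)
def representative_dataset():
    for _ in range(100):
        yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("model_int8.tflite", "wb") as f:
    f.write(tflite_model)
```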

Project details


Release history

This version

0.0.2

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

onnx2tf-0.0.2.tar.gz (28.7 kB)

Uploaded Source

Built Distribution

onnx2tf-0.0.2-py3-none-any.whl (80.6 kB)

Uploaded Python 3

File details

Details for the file onnx2tf-0.0.2.tar.gz.

File metadata

  • Download URL: onnx2tf-0.0.2.tar.gz
  • Upload date:
  • Size: 28.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.14

File hashes

Hashes for onnx2tf-0.0.2.tar.gz
  • SHA256: 3d8cdf47c529e83ac8d5675bb372f67a7997ded099809cda4906631435564112
  • MD5: 4eab2818256485f52fd38d2aae833b14
  • BLAKE2b-256: 12af1f76c3352fcc4d1f85dc1f24b074d526206d6e7cb56eeebcabeaf6c87fa4

See more details on using hashes here.
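
To check a downloaded file against the hashes above, a short Python sketch (assuming the tarball sits in the current directory):

```python
import hashlib

expected_sha256 = "3d8cdf47c529e83ac8d5675bb372f67a7997ded099809cda4906631435564112"

with open("onnx2tf-0.0.2.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected_sha256 else "hash mismatch")
```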

File details

Details for the file onnx2tf-0.0.2-py3-none-any.whl.

File metadata

  • Download URL: onnx2tf-0.0.2-py3-none-any.whl
  • Upload date:
  • Size: 80.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.14

File hashes

Hashes for onnx2tf-0.0.2-py3-none-any.whl
  • SHA256: 015693779ef15a574ea1844fd61ac4bb90573724e20f5a9144e3ccc3d6884e8e
  • MD5: 6227b7ff203594f0b88cb45451064574
  • BLAKE2b-256: a57c4ff4586a66563925c2f1ca26e44c94ec6f9fa3121deb60fa46ff2e1f4d54

See more details on using hashes here.
