

Project description

scs4onnx

A very simple tool that compresses the overall size of the ONNX model by aggregating duplicate constant values as much as possible. Simple Constant value Shrink for ONNX.


Key concept

  • If the same constant tensor is found while scanning the entire graph for constant values, it is aggregated into a single constant tensor (a rough sketch of this idea follows this list).
  • Scalar values are ignored.
  • Variables are ignored.
  • Eventually, a fork of onnx-simplifier will be created and this processing merged in just before its ONNX file output step.
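
As a rough illustration of the first point only, and not the tool's actual implementation, identical initializer tensors in an ONNX graph could be deduplicated with plain onnx along these lines (the function name and structure here are assumptions made for this sketch):

import onnx
from onnx import numpy_helper

def dedup_initializers(model: onnx.ModelProto) -> onnx.ModelProto:
    graph = model.graph
    seen = {}     # (dtype, shape, raw bytes) -> name of the first identical tensor
    rename = {}   # duplicate tensor name -> surviving tensor name
    keep = []
    for init in graph.initializer:
        arr = numpy_helper.to_array(init)
        if arr.ndim == 0:
            keep.append(init)   # scalar values are ignored
            continue
        key = (str(arr.dtype), arr.shape, arr.tobytes())
        if key in seen:
            rename[init.name] = seen[key]
        else:
            seen[key] = init.name
            keep.append(init)
    # Point every node input that referenced a removed duplicate at the surviving tensor.
    for node in graph.node:
        node.input[:] = [rename.get(name, name) for name in node.input]
    del graph.initializer[:]
    graph.initializer.extend(keep)
    return model

model = dedup_initializers(onnx.load('input.onnx'))
onnx.save(model, 'output.onnx')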

1. Setup

1-1. HostPC

### option
$ echo 'export PATH=$HOME/.local/bin:$PATH' >> ~/.bashrc \
&& source ~/.bashrc

### run
$ pip install -U onnx \
&& python3 -m pip install -U onnx_graphsurgeon --index-url https://pypi.ngc.nvidia.com \
&& pip install -U scs4onnx

1-2. Docker

### docker pull
$ docker pull pinto0309/scs4onnx:latest

### docker build
$ docker build -t pinto0309/scs4onnx:latest .

### docker run
$ docker run --rm -it -v `pwd`:/workdir pinto0309/scs4onnx:latest
$ cd /workdir

2. CLI Usage

$ scs4onnx -h

usage:
scs4onnx [-h] [--mode {shrink,npy}] [--non_verbose] input_onnx_file_path output_onnx_file_path

positional arguments:
  input_onnx_file_path
                        Input onnx file path.
  output_onnx_file_path
                        Output onnx file path.

optional arguments:
  -h, --help            show this help message and exit
  --mode {shrink,npy}   Constant Value Compression Mode.
                        shrink: Share constant values inside the model as much as possible.
                                The model size is slightly larger because
                                some shared constant values remain inside the model,
                                but performance is maximized.
                        npy:    Outputs constant values used repeatedly in the model to
                                external .npy files. The model body becomes the smallest,
                                but in exchange the file loading overhead is greater.
                        Default: shrink
  --non_verbose         Do not show all information logs. Only error logs are displayed.

3. In-script Usage

$ python
>>> from scs4onnx import shrinking
>>> help(shrinking)

Help on function shrinking in module scs4onnx.onnx_shrink_constant:

shrinking(
  input_onnx_file_path: Union[str, NoneType] = '',
  output_onnx_file_path: Union[str, NoneType] = '',
  onnx_graph: Union[onnx.onnx_ml_pb2.ModelProto, NoneType] = None,
  mode: Union[str, NoneType] = 'shrink',
  non_verbose: Union[bool, NoneType] = False
) -> Tuple[onnx.onnx_ml_pb2.ModelProto, str]

    Parameters
    ----------
    input_onnx_file_path: Optional[str]
        Input onnx file path.
        Either input_onnx_file_path or onnx_graph must be specified.

    output_onnx_file_path: Optional[str]
        Output onnx file path.
        If output_onnx_file_path is not specified, no .onnx file is output.

    onnx_graph: Optional[onnx.ModelProto]
        onnx.ModelProto.
        Either input_onnx_file_path or onnx_graph must be specified.
        If onnx_graph is specified, input_onnx_file_path is ignored and onnx_graph is processed.

    mode: Optional[str]
        Constant Value Compression Mode.
        'shrink': Share constant values inside the model as much as possible.
            The model size is slightly larger because some shared constant values remain
            inside the model, but performance is maximized.
        'npy': Outputs constant values used repeatedly in the model to external .npy files.
            The model body becomes the smallest, but in exchange the file loading overhead is greater.
        Default: shrink

    non_verbose: Optional[bool]
        Do not show all information logs. Only error logs are displayed.
        Default: False

    Returns
    -------
    shrunken_graph: onnx.ModelProto
        Shrunken onnx ModelProto

    npy_file_paths: List[str]
        List of paths to externally output .npy files.
        An empty list is always returned when in 'shrink' mode.

4. CLI Execution

$ scs4onnx input.onnx output.onnx --mode shrink
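
To write the repeated constants out to external .npy files instead, the npy mode from the help above can be used in the same way:

$ scs4onnx input.onnx output.onnx --mode npy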


5. In-script Execution

5-1. When an onnx file is used as input

If output_onnx_file_path is not specified, no .onnx file is output.

from scs4onnx import shrinking

shrunk_graph, npy_file_paths = shrinking(
  input_onnx_file_path='input.onnx',
  output_onnx_file_path='output.onnx',
  mode='npy',
  non_verbose=False
)
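
Because mode='npy' is used here, the returned npy_file_paths list can be checked directly; a small sketch with numpy:

import numpy as np

for npy_path in npy_file_paths:
    param = np.load(npy_path)
    print(npy_path, param.shape, param.dtype)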


5-2. When an onnx.ModelProto is used as input

If onnx_graph is specified, input_onnx_file_path is ignored and onnx_graph is processed.

from scs4onnx import shrinking

shrunk_graph, npy_file_paths = shrinking(
  onnx_graph=graph,
  mode='npy',
  non_verbose=True
)
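
Since no output_onnx_file_path is given in this call, nothing is written to disk. A minimal sketch of obtaining graph and saving the returned model manually (assuming the input model is a local input.onnx):

import onnx
from scs4onnx import shrinking

graph = onnx.load('input.onnx')

shrunk_graph, npy_file_paths = shrinking(
  onnx_graph=graph,
  mode='npy',
  non_verbose=True
)

onnx.save(shrunk_graph, 'output.onnx')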

6. Sample

6-1. shrink mode sample

  • 297.8MB -> 67.4MB


6-2. npy mode sample

  • 297.8MB -> 21.3MB


6-3. .npy file view

$ python
>>> import numpy as np
>>> param = np.load('gmflow_sintel_480x640_shrunken_exported_1646.npy')
>>> param.shape
(8, 1200, 1200)
>>> param
array([[[   0.,    0.,    0., ...,    0.,    0.,    0.],
        [   0.,    0.,    0., ...,    0.,    0.,    0.],
        [   0.,    0.,    0., ...,    0.,    0.,    0.],
        ...,
        [-100., -100., -100., ...,    0.,    0.,    0.],
        [-100., -100., -100., ...,    0.,    0.,    0.],
        [-100., -100., -100., ...,    0.,    0.,    0.]]], dtype=float32)

7. Reference

  1. https://docs.nvidia.com/deeplearning/tensorrt/onnx-graphsurgeon/docs/index.html
  2. https://github.com/NVIDIA/TensorRT/tree/main/tools/onnx-graphsurgeon



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

scs4onnx-1.0.7.tar.gz (7.0 kB)

Uploaded Source

Built Distribution

scs4onnx-1.0.7-py3-none-any.whl (7.8 kB)

Uploaded Python 3

File details

Details for the file scs4onnx-1.0.7.tar.gz.

File metadata

  • Download URL: scs4onnx-1.0.7.tar.gz
  • Upload date:
  • Size: 7.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.0 CPython/3.10.4

File hashes

Hashes for scs4onnx-1.0.7.tar.gz
Algorithm Hash digest
SHA256 4f9fd46739c44cfd8b282ce68cb5d82e115da2a47256914888c0be5c813d197b
MD5 dd2d6bcd7e0377e4ac8abb8f0083a249
BLAKE2b-256 ba987e52fed111e6ecd3d7a0298afad64c227da06baef3403cea8a737f8df95b

See more details on using hashes here.

File details

Details for the file scs4onnx-1.0.7-py3-none-any.whl.

File metadata

  • Download URL: scs4onnx-1.0.7-py3-none-any.whl
  • Upload date:
  • Size: 7.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.0 CPython/3.10.4

File hashes

Hashes for scs4onnx-1.0.7-py3-none-any.whl
Algorithm Hash digest
SHA256 ee12f36be862c36d121787aacfa4aff8c3df1764846e94fe8f063bd8e3974356
MD5 cbe7aed3ab4548b1609b048d29ba6113
BLAKE2b-256 03a9bbf43b445f4666fa9de5495de73d457b5d70ee1e98544acc0295d3776dca

See more details on using hashes here.
