
onnxmodel-utils

Utilities for working with ONNX models.

Examples

Simple if model

from onnxmodel_utils import Model, build_if_model


model1 = Model.load("model1.onnx")
model2 = Model.load("model2.onnx")

# Merge the two models into one graph that dispatches on the boolean
# input "cond": model1 runs when cond is True, model2 otherwise.
model = build_if_model(
    "if_model",
    "cond",
    model1,
    model2,
)
model.save("if_model.onnx")


import numpy as np
import onnxruntime

sess = onnxruntime.InferenceSession("if_model.onnx")
inps = {
    "input": np.random.randn(1, 3, 224, 224).astype(np.float32),
    "cond": np.array([True], dtype=bool),
}
out1 = sess.run(None, inps)  # model1's branch

inps["cond"] = np.array([False], dtype=bool)
out2 = sess.run(None, inps)  # model2's branch
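Conceptually, the merged graph behaves like a Python conditional over the two sub-models. A plain-NumPy sketch of those semantics (toy branch functions standing in for model1.onnx / model2.onnx, not the library's API):

```python
import numpy as np

def run_if_model(cond, x, then_branch, else_branch):
    """Semantics of the merged model: dispatch on a boolean input."""
    return then_branch(x) if bool(cond[0]) else else_branch(x)

# Toy stand-ins with identical input/output signatures, as the two
# merged models must have.
model1 = lambda x: x + 1.0
model2 = lambda x: x * 3.0

x = np.ones((1, 3), dtype=np.float32)
out1 = run_if_model(np.array([True]), x, model1, model2)   # model1 branch
out2 = run_if_model(np.array([False]), x, model1, model2)  # model2 branch
```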

Optional cache model

from onnxmodel_utils import Model, build_if_model_with_cache


decoder = Model.load("decoder.onnx")
decoder_init = Model.load("decoder_init.onnx")

# Merge the cached decoder and the cacheless init decoder into one model;
# the branch is selected by whether the optional cache inputs are present.
model = build_if_model_with_cache(
    name="merged_model",
    cache_model=decoder,
    cacheless_model=decoder_init,
    cache_names=["pasts", "pasts_st"],
)
model.save("merged_model.onnx")


import numpy as np
import onnxruntime

sess = onnxruntime.InferenceSession("merged_model.onnx")
inps = {
    "input_ids": np.array([[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]], dtype=np.int64),
    "target_ids": np.array([[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]], dtype=np.int64),
    # The cache inputs are optional; passing None runs the cacheless branch.
    "pasts": None,
    "pasts_st": None,
}

init_out = sess.run(None, inps)

# Feed the caches produced by the first call back in; subsequent calls
# take the cached branch.
inps["pasts"] = init_out[1]
inps["pasts_st"] = init_out[2]

out = sess.run(None, inps)
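The two-phase call above is the usual autoregressive pattern: the first call (no cache) takes the cacheless branch and emits a cache, and later calls feed that cache back in. A toy pure-Python stand-in for that control flow (the `step` function is hypothetical, not onnxruntime or this library's API):

```python
def step(input_ids, pasts=None):
    """Toy decoder step returning (output, new_cache).
    With pasts=None it acts like the cacheless init model;
    otherwise it extends the cache, like the cached decoder."""
    if pasts is None:
        pasts = []           # cacheless branch: build the cache from scratch
    new_cache = pasts + [input_ids[-1]]
    output = sum(new_cache)  # stand-in for the real decoder computation
    return output, new_cache

# First call: no cache, so the cacheless branch runs.
out, cache = step([1, 2, 3])
# Later calls: feed the cache from the previous call back in.
out2, cache2 = step([1, 2, 3, 4], pasts=cache)
```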

Installation

pip install onnxmodel-utils

