# onnxmodel-utils

Utils for working with ONNX models.
## Example

### Simple if model
```python
import numpy as np
import onnxruntime

from onnxmodel_utils import Model, build_if_model

model1 = Model.load('model1.onnx')
model2 = Model.load('model2.onnx')

# Merge the two models behind a condition input named "cond".
model = build_if_model(
    "if_model",
    "cond",
    model1,
    model2,
)
model.save('if_model.onnx')

sess = onnxruntime.InferenceSession('if_model.onnx')
inps = {
    "input": np.random.randn(1, 3, 224, 224).astype(np.float32),
    "cond": np.array([True]).astype(bool),  # np.bool is removed in NumPy >= 1.24
}
out1 = sess.run(None, inps)  # runs model1's branch

inps["cond"] = np.array([False]).astype(bool)
out2 = sess.run(None, inps)  # runs model2's branch
```
### Optional cache model
```python
import numpy as np
import onnxruntime

from onnxmodel_utils import Model, build_if_model_with_cache

decoder = Model.load("decoder.onnx")
decoder_init = Model.load("decoder_init.onnx")

# Merge the cached and cacheless decoders; the cache inputs become optional.
model = build_if_model_with_cache(
    name="merged_model",
    cache_model=decoder,
    cacheless_model=decoder_init,
    cache_names=["pasts", "pasts_st"],
)
model.save("merged_model.onnx")

sess = onnxruntime.InferenceSession("merged_model.onnx")
inps = {
    "input_ids": np.array([[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]], dtype=np.int64),
    "target_ids": np.array([[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]], dtype=np.int64),
    "pasts": None,     # no cache yet: the cacheless branch runs
    "pasts_st": None,
}
init_out = sess.run(None, inps)

inps["pasts"] = init_out[1]     # feed the cache back in: the cached branch runs
inps["pasts_st"] = init_out[2]
out = sess.run(None, inps)
```
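The control flow the merged model implements is "run the cacheless decoder when no cache is supplied, otherwise run the cached decoder". A pure-NumPy sketch of that pattern, with hypothetical stand-in functions (`decoder_init_step` and `decoder_cached_step` are illustrative placeholders, not part of this library):

```python
import numpy as np

def decoder_init_step(input_ids):
    # First step: no cache exists yet, so build it from the full input.
    pasts = np.cumsum(input_ids, axis=-1).astype(np.float32)
    logits = pasts[..., -1:]
    return logits, pasts

def decoder_cached_step(input_ids, pasts):
    # Later steps: extend the existing cache instead of recomputing it.
    new_pasts = np.concatenate(
        [pasts, pasts[..., -1:] + input_ids[..., -1:]], axis=-1
    )
    return new_pasts[..., -1:], new_pasts

def run_merged(input_ids, pasts=None):
    # Mirrors the If node's condition: "is the cache present?"
    if pasts is None:
        return decoder_init_step(input_ids)
    return decoder_cached_step(input_ids, pasts)

ids = np.array([[1, 2, 3]], dtype=np.int64)
logits, pasts = run_merged(ids)           # cacheless branch, cache created
logits2, pasts2 = run_merged(ids, pasts)  # cached branch, cache extended
```

Exporting this kind of Python `if` to a single ONNX file is exactly what the merged model avoids having to do at the application level: the branch lives inside the graph, so the caller only varies the inputs.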
## Installation

```
pip install onnxmodel-utils
```