# onnxmodel-utils

Utils for working with ONNX models.

## Example

### Simple if model
```python
import numpy as np
import onnxruntime

from onnxmodel_utils import Model, build_if_model

model1 = Model.load("model1.onnx")
model2 = Model.load("model2.onnx")
model = build_if_model(
    "if_model",
    "cond",
    model1,
    model2,
)
model.save("if_model.onnx")

sess = onnxruntime.InferenceSession("if_model.onnx")
inps = {
    "input": np.random.randn(1, 3, 224, 224).astype(np.float32),
    "cond": np.array([True], dtype=bool),
}
out1 = sess.run(None, inps)

inps["cond"] = np.array([False], dtype=bool)
out2 = sess.run(None, inps)
```
### Optional cache model
```python
from onnxmodel_utils import Model, build_if_model_with_cache

decoder = Model.load("decoder.onnx")
decoder_init = Model.load("decoder_init.onnx")
model = build_if_model_with_cache(
    name="merged_model",
    cache_model=decoder,
    cacheless_model=decoder_init,
    cache_names=["pasts", "pasts_st"],
)
model.save("merged_model.onnx")

import onnxruntime
import numpy as np

sess = onnxruntime.InferenceSession("merged_model.onnx")
inps = {
    "input_ids": np.array([[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]], dtype=np.int64),
    "target_ids": np.array([[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]], dtype=np.int64),
    "pasts": None,
    "pasts_st": None,
}
init_out = sess.run(None, inps)

inps["pasts"] = init_out[1]
inps["pasts_st"] = init_out[2]
out = sess.run(None, inps)
```
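In practice a merged decoder like this is typically driven in a loop: the first call runs the cacheless branch (caches are `None`), and every later call feeds the caches returned by the previous step. The sketch below shows one such greedy loop. It assumes the output order is `[logits, pasts, pasts_st]` (matching the `init_out[1]`/`init_out[2]` indices above) and that logits have shape `[batch, seq, vocab]`; adapt the names and indices to your exported models.

```python
import numpy as np


def greedy_decode(sess, input_ids, steps=5):
    """Greedy decoding loop over the merged decoder.

    Assumes sess.run returns [logits, pasts, pasts_st], where logits
    has shape [batch, seq, vocab]. These are assumptions about the
    exported models, not guarantees made by onnxmodel-utils.
    """
    pasts, pasts_st = None, None
    target_ids = input_ids
    for _ in range(steps):
        logits, pasts, pasts_st = sess.run(None, {
            "input_ids": input_ids,
            "target_ids": target_ids,
            "pasts": pasts,       # None on the first step -> cacheless branch
            "pasts_st": pasts_st,
        })
        # Pick the most likely next token and append it to the sequence.
        next_id = logits[:, -1:].argmax(axis=-1).astype(np.int64)
        target_ids = np.concatenate([target_ids, next_id], axis=1)
    return target_ids
```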
## Installation

```shell
pip install onnxmodel-utils
```