onnxmodel-utils
Utilities for working with ONNX models.
Examples
Simple If model
from onnxmodel_utils import Model, build_if_model

model1 = Model.load("model1.onnx")
model2 = Model.load("model2.onnx")

model = build_if_model(
    "if_model",
    "cond",
    model1,
    model2,
)
model.save("if_model.onnx")
import numpy as np
import onnxruntime

sess = onnxruntime.InferenceSession("if_model.onnx")
inps = {
    "input": np.random.randn(1, 3, 224, 224).astype(np.float32),
    "cond": np.array([True], dtype=np.bool_),
}
out1 = sess.run(None, inps)

inps["cond"] = np.array([False], dtype=np.bool_)
out2 = sess.run(None, inps)
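Conceptually, build_if_model wires the two models into the branches of an ONNX `If` node: the boolean `cond` input selects exactly one subgraph per run. A minimal plain-Python sketch of that dispatch semantics (the helper and branch functions below are hypothetical illustrations, not part of the library):

```python
import numpy as np

def run_if_model(cond, x, then_branch, else_branch):
    # Mirrors ONNX If-node dispatch: exactly one branch executes per call.
    return then_branch(x) if bool(cond) else else_branch(x)

x = np.full((2, 2), 3.0, dtype=np.float32)
out_true = run_if_model(np.array([True]), x, lambda a: a * 2.0, lambda a: a + 1.0)
out_false = run_if_model(np.array([False]), x, lambda a: a * 2.0, lambda a: a + 1.0)
```

Flipping `cond` between runs, as in the session example above, switches which branch's outputs you get.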
Optional cache model
from onnxmodel_utils import Model, build_if_model_with_cache

decoder = Model.load("decoder.onnx")
decoder_init = Model.load("decoder_init.onnx")

model = build_if_model_with_cache(
    name="merged_model",
    cache_model=decoder,
    cacheless_model=decoder_init,
    cache_names=["pasts", "pasts_st"],
)
model.save("merged_model.onnx")
import numpy as np
import onnxruntime

sess = onnxruntime.InferenceSession("merged_model.onnx")
inps = {
    "input_ids": np.array([[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]], dtype=np.int64),
    "target_ids": np.array([[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]], dtype=np.int64),
    "pasts": None,
    "pasts_st": None,
}

# First pass: the cache inputs are empty, so the cacheless (init) model runs
init_out = sess.run(None, inps)

# Feed the returned cache tensors back in; the cached model runs from here on
inps["pasts"] = init_out[1]
inps["pasts_st"] = init_out[2]
out = sess.run(None, inps)
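The merged model's control flow is: when the cache inputs are absent, run the cacheless init model; otherwise, run the cache model with the supplied past state. A rough plain-Python analogue (the function names and toy branches here are hypothetical, standing in for the two ONNX subgraphs):

```python
def run_merged(input_ids, pasts, init_fn, step_fn):
    # No cache yet -> cacheless branch (the first decoding step)
    if pasts is None:
        return init_fn(input_ids)
    # Cache present -> cached branch reuses the past state
    return step_fn(input_ids, pasts)

# Toy branch functions: each returns (output, new_cache)
init = lambda ids: (ids, list(ids))
step = lambda ids, cache: (ids, cache + ids)

out, cache = run_merged([1, 2], None, init, step)    # init branch
out2, cache2 = run_merged([3], cache, init, step)    # cached branch
```

This mirrors the session example above: the first `sess.run` call populates the cache outputs, and subsequent calls feed them back in.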
Installation
pip install onnxmodel-utils