
Inference module.

Project description

How to use

Use directly

  • model_file_path: Path to the model file (ONNX or TFLite)
  • dataset_file_path: Path to a compressed archive, or to individual .npy files named after the corresponding model input layer
python3 inference.py --model_file_path tests/your_model_file.tflite --dataset_file_path tests/your_dataset_file.npy 
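Since the dataset file is an .npy array named after the model's input layer, one can be prepared with NumPy. A minimal sketch, assuming an input layer called "input_1" and an input shape of (1, 224, 224, 3) — substitute your model's actual input name and shape:

```python
import numpy as np

# Dummy batch standing in for real input data; the shape and dtype
# here are assumptions based on a typical image model.
dummy_batch = np.random.rand(1, 224, 224, 3).astype(np.float32)

# Save the array as an .npy file named after the input layer,
# so it can be passed via --dataset_file_path.
np.save("input_1.npy", dummy_batch)

# Verify the file round-trips.
loaded = np.load("input_1.npy")
print(loaded.shape)  # (1, 224, 224, 3)
```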

Import and use

from netspresso_inference_package.inference.inference_service import InferenceService
inf_service = InferenceService(
    model_file_path="/app/tests/people_detection.onnx",
    dataset_file_path="/app/tests/dataset_for_onnx.npy",
)
inf_service.run()
print(inf_service.result_file_path)

Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distribution

File details

Details for the file netspresso_inference_package-0.1.3-py3-none-any.whl.

File hashes

Hashes for netspresso_inference_package-0.1.3-py3-none-any.whl
Algorithm    Hash digest
SHA256       54d2a1baeb8983cbf35da2ffc37b98b5bb525b67c187d2e8b01a4ff42593456a
MD5          c7c49ec99d6393b59c31503dfc1905e8
BLAKE2b-256  799d5502aa9dbf740f0f266036eb2bcbcc91e4caa4803e251bc05d34c535b971

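To check a downloaded wheel against the published SHA256 digest, a small helper using the standard-library hashlib module is enough. A sketch — the wheel path is an assumption about where you saved the file:

```python
import hashlib

# Published SHA256 digest for this release's wheel.
EXPECTED_SHA256 = "54d2a1baeb8983cbf35da2ffc37b98b5bb525b67c187d2e8b01a4ff42593456a"

def sha256_of(path: str) -> str:
    """Compute the SHA256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Assumed local path to the downloaded wheel; uncomment to verify:
# if sha256_of("netspresso_inference_package-0.1.3-py3-none-any.whl") == EXPECTED_SHA256:
#     print("hash OK")
```

pip can also enforce this automatically via its hash-checking mode (`--require-hashes` with a requirements file).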
