An edge inference library for embedded devices equipped with the EPU coprocessor designed by iluvatar.ai.

Project description

An edge inference library for embedded devices that contain the Edge EPU coprocessor. It is ideal for prototyping new projects that demand fast on-device inference for machine learning models. The tflex library provides three command-line tools: tflexconverter converts a .pb/.h5 model into a .tflex model that runs directly on the EPU; tflexviewer visualizes the network architecture (both .pb and .tflex files are supported) on top of TensorBoard for easier inspection; and tflexverify validates that a model was converted successfully. In other words, once a pre-trained or custom model is ready, you can convert it to the EPU format with tflexconverter and deploy it on your device for inference.
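A rough sketch of the workflow is below; the arguments shown are placeholders (this page does not document the tools' actual flags), so consult each tool's own help output for the real usage.

    # Convert a trained TensorFlow model (.pb or .h5) to the EPU's .tflex format.
    # NOTE: the positional argument is illustrative, not a documented flag.
    tflexconverter frozen_model.pb

    # Visualize the converted network architecture in TensorBoard (.pb files also work).
    tflexviewer model.tflex

    # Check that the conversion produced a valid .tflex model.
    tflexverify model.tflex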

Project details


Download files

Download the file for your platform.

Built Distribution

tflex-1.0.0rc2-py2.py3-none-any.whl (51.2 kB)

Uploaded: py2, py3
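
Assuming the package is published on PyPI under the name tflex (as the wheel filename suggests), the wheel can be installed with pip; note that pip skips pre-releases such as 1.0.0rc2 unless they are requested explicitly:

    # Allow pip to pick up the release candidate.
    pip install --pre tflex

    # Or pin the exact pre-release version.
    pip install tflex==1.0.0rc2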
