
An edge inference library for embedded devices with the EPU designed by iluvatar.ai.

Project description

An edge inference library for embedded devices that contain an Edge EPU coprocessor. It is ideal for prototyping new projects that demand fast on-device inference for machine learning models. The tflex library provides three command-line tools:

- tflexconverter converts a .pb or .h5 model into a .tflex model that is directly supported on the EPU.
- tflexviewer displays the network architecture more intuitively, based on TensorBoard (both .pb and .tflex files are supported).
- tflexverify validates whether a model was converted successfully.

In short, once a pre-trained or custom model is ready, run tflexconverter to convert it to the EPU format, then deploy the converted model on your device for inference, as sketched below.
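As a rough sketch of that workflow driven from Python: the tool names come from the description above, but the command-line flags shown are assumptions and not documented tflex options; check each tool's own help output for the real interface.

    # Hypothetical workflow sketch: convert a frozen TensorFlow .pb model to
    # .tflex, then check that the conversion succeeded. The --input/--output/
    # --model flags are assumptions, not the documented tflex CLI.
    import subprocess

    # Convert the model to the EPU-supported .tflex format.
    subprocess.run(
        ["tflexconverter", "--input", "model.pb", "--output", "model.tflex"],
        check=True,
    )

    # Validate that the converted model is usable on the EPU.
    subprocess.run(
        ["tflexverify", "--model", "model.tflex"],
        check=True,
    )

tflexviewer can then be used to inspect the converted network's architecture in TensorBoard.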

Download files

Source Distributions

No source distribution files are available for this release.

Built Distribution

tflex-1.0.0rc2-py2.py3-none-any.whl (51.2 kB)

Uploaded: Python 2, Python 3
