
Inference Engine Platform for Edge Devices


elangai

elangai is an inference engine platform designed for, but not limited to, edge devices. This SDK includes the main components of elangai.

Installation

Prerequisites

Before installing elangai, you must install the following components:

  • TensorRT >= 7.0.0.11
  • TensorFlow >= 2.5.0
  • ONNXRuntime >= 1.5.2

Otherwise, elangai won't work as intended.
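
If you are unsure which versions are installed, a quick check from Python (independent of elangai) can confirm that the backends are present and recent enough:

import tensorrt
import tensorflow
import onnxruntime

# The printed versions should meet the minimums listed above.
print("TensorRT:   ", tensorrt.__version__)     # expect >= 7.0.0.11
print("TensorFlow: ", tensorflow.__version__)   # expect >= 2.5.0
print("ONNXRuntime:", onnxruntime.__version__)  # expect >= 1.5.2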

Install

elangai can be installed with the following command:

pip3 install elangai
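
To confirm the installation succeeded, you can query the installed package version with standard Python tooling (this is not an elangai API):

from importlib.metadata import version

# Prints the installed elangai version, e.g. 1.0.0
print(version("elangai"))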

Quick Guide

Make Your Own AI App

Using the following command in a terminal, you can easily generate an AI app:

elangai generate myapp
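
Assuming the command scaffolds a new project directory named after the app (here, myapp, taken from the command above), you can inspect what was generated:

from pathlib import Path

# List the contents of the generated app directory (assumes "myapp" exists).
for entry in sorted(Path("myapp").iterdir()):
    print(entry)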

Dissect Your AI Model

Use the following command to examine the input and output tensor information of your AI model.

elangai model-dissector /path/to/your/model.onnx

Currently, only .onnx and .tflite files are supported.
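
For reference, the same kind of input and output tensor information can be read directly with ONNX Runtime. The snippet below is only an illustration of what the dissector reports; it does not use elangai itself:

import onnxruntime as ort

# Load the model and print the name, shape, and element type of every tensor.
session = ort.InferenceSession("/path/to/your/model.onnx")
for tensor in session.get_inputs():
    print("input: ", tensor.name, tensor.shape, tensor.type)
for tensor in session.get_outputs():
    print("output:", tensor.name, tensor.shape, tensor.type)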

Convert Your AI Model to .trt File

elangai only supports inference from .tflite and .trt files. Use the following command to convert an .onnx model to a .trt file.

elangai model-converter --src /path/to/model.onnx --dst /path/to/model.trt

Currently, only .onnx file conversion is supported.
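
This step corresponds to the conversion TensorRT itself performs. The sketch below builds a .trt engine from an .onnx file using the TensorRT 7-style Python API directly, as a rough picture of what such a conversion involves; it is not elangai's actual implementation:

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

# Parse the ONNX model into a TensorRT network definition.
builder = trt.Builder(TRT_LOGGER)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, TRT_LOGGER)

with open("/path/to/model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError("failed to parse the ONNX model")

# Build the engine and serialize it to a .trt file.
config = builder.create_builder_config()
config.max_workspace_size = 1 << 28  # 256 MiB of workspace memory
engine = builder.build_engine(network, config)

with open("/path/to/model.trt", "wb") as f:
    f.write(engine.serialize())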
