A package for running EdgeYOLO models using ONNX, TensorFlow Lite, and CoreML on images and videos

Project description

EdgeYOLO Runner

A Python package for running EdgeYOLO models using ONNX Runtime, TensorFlow Lite, and CoreML on images and videos. The package supports both FP32 and FP16 precision and can run on the CPU or with GPU acceleration.

Installation

Basic Installation (ONNX support only)

pip install edgeyolo_runner

With TensorFlow Lite Support

pip install edgeyolo_runner[tflite]
# or for all optional dependencies
pip install edgeyolo_runner[all]

Usage

Using the DetectorFactory (Recommended)

The DetectorFactory automatically detects the model type based on the file extension:

from edgeyolo_runner import DetectorFactory
import cv2

# Auto-detect model type (.onnx or .tflite)
detector = DetectorFactory.create_detector(
    model_path="path/to/your/model.onnx",  # or .tflite
    detector_type="auto",  # Auto-detect based on file extension
    conf_thres=0.25,
    nms_thres=0.5,
    use_acceleration=True  # CUDA for ONNX, GPU for TFLite
)

# Read and process image
image = cv2.imread("image.jpg")
detections = detector(image)

# Process detections
if detections is not None:
    for det in detections:
        box = det[:4]           # [x1, y1, x2, y2] bounding box
        score = det[4]          # confidence score
        class_id = int(det[5])  # class index
        # Draw boxes, etc.
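As the loop above suggests, each detection row packs box coordinates, a confidence score, and a class id, so the whole result can be filtered with plain NumPy instead of a Python loop. A minimal sketch with hypothetical sample values (the `[x1, y1, x2, y2, score, class_id]` row layout is assumed from the indexing shown above):

```python
import numpy as np

# Hypothetical detections in the [x1, y1, x2, y2, score, class_id] layout
detections = np.array([
    [ 10.0,  20.0, 110.0, 220.0, 0.91, 0.0],   # class 0, high confidence
    [ 50.0,  60.0, 150.0, 260.0, 0.31, 2.0],   # class 2, low confidence
    [200.0,  80.0, 300.0, 180.0, 0.78, 2.0],   # class 2, high confidence
])

# Keep only class-2 detections scoring above 0.5
mask = (detections[:, 5] == 2) & (detections[:, 4] > 0.5)
selected = detections[mask]
print(selected.shape)  # (1, 6)
```

Vectorized masks like this are handy when you only need one class (e.g. "person") out of a full COCO-style prediction.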

ONNX Detector

from edgeyolo_runner import ONNXDetector
import cv2

# Initialize ONNX detector
detector = ONNXDetector(
    model_path="path/to/your/model.onnx",
    conf_thres=0.25,
    nms_thres=0.5,
    fp16=False,  # Set to True for FP16 inference
    use_cuda=True  # Set to False for CPU inference
)

# Read and process image
image = cv2.imread("image.jpg")
detections = detector(image)

TensorFlow Lite Detector

from edgeyolo_runner import TFLiteDetector
import cv2

# Initialize TFLite detector
detector = TFLiteDetector(
    model_path="path/to/your/model.tflite",
    conf_thres=0.25,
    nms_thres=0.5,
    fp16=False,  # Set to True for FP16 inference
    use_gpu=True  # Set to False for CPU inference
)

# Read and process image
image = cv2.imread("image.jpg")
detections = detector(image)

Video Detection

from edgeyolo_runner import DetectorFactory
import cv2

# Works with both ONNX and TFLite models
detector = DetectorFactory.create_detector("path/to/your/model.onnx")

cap = cv2.VideoCapture("video.mp4")
while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
        
    detections = detector(frame)
    # Process detections and draw on frame
    
    print(f"Inference time: {detector.dt*1000:.2f}ms")
    
cap.release()
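The per-frame `detector.dt` value printed above can also be aggregated into an average latency and effective FPS for the whole clip. A small self-contained sketch (it assumes `detector.dt` holds the last inference time in seconds, as the `detector.dt*1000` conversion above implies; the timing values below are stand-ins):

```python
# Accumulate per-frame inference times and report averages.
class TimingStats:
    def __init__(self):
        self.total = 0.0   # total inference time in seconds
        self.frames = 0    # number of frames processed

    def update(self, dt):
        self.total += dt
        self.frames += 1

    @property
    def avg_ms(self):
        # Average latency per frame in milliseconds
        return 1000.0 * self.total / max(self.frames, 1)

    @property
    def fps(self):
        # Effective inference throughput
        return self.frames / self.total if self.total else 0.0

stats = TimingStats()
for dt in (0.020, 0.030, 0.025):  # stand-in values for detector.dt
    stats.update(dt)
print(f"avg {stats.avg_ms:.1f} ms, {stats.fps:.1f} FPS")  # avg 25.0 ms, 40.0 FPS
```

In the video loop you would call `stats.update(detector.dt)` after each `detector(frame)` and print the summary once after `cap.release()`.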

Command Line Usage

Use the provided example script to compare detectors:

# Auto-detect model type and run inference
python examples/detector_comparison.py --model model.onnx --image image.jpg

# Specifically use TFLite detector
python examples/detector_comparison.py --model model.tflite --image image.jpg --detector tflite

# Use hardware acceleration
python examples/detector_comparison.py --model model.onnx --image image.jpg --use-acceleration

# Save output
python examples/detector_comparison.py --model model.onnx --image image.jpg --output result.jpg
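A command line like the ones above maps naturally onto `argparse`. This is a hypothetical sketch of such a parser, mirroring only the flags shown here; the shipped `examples/detector_comparison.py` may define more options or different defaults:

```python
import argparse

def build_parser():
    # Flags mirror the example invocations above (assumed, not the actual script)
    p = argparse.ArgumentParser(description="Compare EdgeYOLO detectors")
    p.add_argument("--model", required=True, help="Path to a .onnx or .tflite model")
    p.add_argument("--image", required=True, help="Input image path")
    p.add_argument("--detector", choices=["auto", "onnx", "tflite"], default="auto",
                   help="Force a backend, or auto-detect from the file extension")
    p.add_argument("--use-acceleration", action="store_true",
                   help="CUDA for ONNX, GPU delegate for TFLite")
    p.add_argument("--output", default=None, help="Optional path for annotated output")
    return p

args = build_parser().parse_args(
    ["--model", "model.tflite", "--image", "image.jpg", "--detector", "tflite"]
)
print(args.detector)  # tflite
```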

Features

  • Dual Runtime Support: Both ONNX Runtime and TensorFlow Lite
  • Auto Model Detection: Automatic model type detection based on file extension
  • Hardware Acceleration: CUDA support for ONNX, GPU delegation for TFLite
  • Precision Options: FP16 and FP32 precision support
  • Easy-to-use API: Unified interface for both model types
  • Batch Processing: Support for processing multiple images
  • Performance Monitoring: Built-in inference time measurement

Model Support

  • ONNX Models: .onnx files using ONNX Runtime
  • TensorFlow Lite Models: .tflite files using TensorFlow Lite
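The extension-to-backend mapping above is what `detector_type="auto"` presumably keys on. A minimal sketch of such a dispatch (assumed behaviour for illustration; the package's actual logic may differ):

```python
from pathlib import Path

# Map file extensions to backend names (assumed from the table above)
_EXTENSION_MAP = {".onnx": "onnx", ".tflite": "tflite"}

def detect_model_type(model_path: str) -> str:
    """Pick a backend from the model file extension, case-insensitively."""
    ext = Path(model_path).suffix.lower()
    try:
        return _EXTENSION_MAP[ext]
    except KeyError:
        raise ValueError(f"Unsupported model extension: {ext!r}") from None

print(detect_model_type("weights/model.ONNX"))  # onnx
print(detect_model_type("model.tflite"))        # tflite
```

Failing loudly on an unknown extension is preferable to silently defaulting to one backend, since a wrong runtime would only surface later as a confusing load error.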

Requirements

Core Dependencies

  • Python >= 3.7
  • ONNX Runtime >= 1.15.0
  • OpenCV >= 4.5.0
  • NumPy >= 1.19.0
  • Pillow >= 8.0.0

Optional Dependencies

  • TensorFlow >= 2.8.0: For TensorFlow Lite support
  • ONNX Runtime GPU: For CUDA acceleration

License

MIT License

Download files

Download the file for your platform.

Source Distribution

edgeyolo_runner-0.2.0.tar.gz (9.7 kB)

Uploaded Source

Built Distribution


edgeyolo_runner-0.2.0-py3-none-any.whl (12.5 kB)

Uploaded Python 3

File details

Details for the file edgeyolo_runner-0.2.0.tar.gz.

File metadata

  • Download URL: edgeyolo_runner-0.2.0.tar.gz
  • Upload date:
  • Size: 9.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.9.19

File hashes

Hashes for edgeyolo_runner-0.2.0.tar.gz

  • SHA256: cefb064883d1501d99ed212850ddd56da7098d7af44ce12973fd097553f87ef3
  • MD5: ec1287b84741704b92df7f3b90c65abf
  • BLAKE2b-256: a7604e65723fa17007d6edbd6ddb0d6d37dec3059c12aa6dd6e35fc19c772ad6


File details

Details for the file edgeyolo_runner-0.2.0-py3-none-any.whl.

File metadata

File hashes

Hashes for edgeyolo_runner-0.2.0-py3-none-any.whl

  • SHA256: 9dca3746a0504137506bdfcd53d784a96f4020b226fc9e0a7ad8fc8754ab756e
  • MD5: 6600ea8e3221c50b42cd11b7f27d50c2
  • BLAKE2b-256: 4d70a35292aad94ea329f4ac818a1f26607a80ba679425319403fe195cb3ee40

