
EdgeYOLO Runner

A Python package for running EdgeYOLO models with ONNX Runtime, TensorFlow Lite, and CoreML on images and videos. It supports both FP32 and FP16 precision and can run on CPU or with GPU acceleration.

Installation

Basic Installation (ONNX support only)

pip install edgeyolo_runner

With TensorFlow Lite Support

pip install "edgeyolo_runner[tflite]"
# or, for all optional dependencies (quotes keep shells like zsh from expanding the brackets)
pip install "edgeyolo_runner[all]"

Usage

Using the DetectorFactory (Recommended)

The DetectorFactory automatically detects the model type based on the file extension:

from edgeyolo_runner import DetectorFactory
import cv2

# Auto-detect model type (.onnx or .tflite)
detector = DetectorFactory.create_detector(
    model_path="path/to/your/model.onnx",  # or .tflite
    detector_type="auto",  # Auto-detect based on file extension
    conf_thres=0.25,
    nms_thres=0.5,
    use_acceleration=True  # CUDA for ONNX, GPU for TFLite
)

# Read and process image
image = cv2.imread("image.jpg")
detections = detector(image)

# Process detections
if detections is not None:
    for det in detections:
        x1, y1, x2, y2 = det[:4]  # box corners
        score = det[4]            # confidence
        class_id = int(det[5])    # class index
        # Draw boxes, labels, etc.
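Judging from the indexing above, each detection row is a 6-element vector [x1, y1, x2, y2, score, class_id]. A small helper that converts raw rows into dictionaries can make downstream code clearer; this is an illustrative sketch, not part of the package:

```python
import numpy as np

def detections_to_dicts(detections, class_names=None):
    """Convert Nx6 detection rows [x1, y1, x2, y2, score, class_id]
    into a list of dicts for easier downstream use."""
    results = []
    for det in np.asarray(detections):
        x1, y1, x2, y2, score, class_id = det[:6]
        entry = {
            "box": (float(x1), float(y1), float(x2), float(y2)),
            "score": float(score),
            "class_id": int(class_id),
        }
        if class_names is not None:
            entry["label"] = class_names[int(class_id)]
        results.append(entry)
    return results
```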

ONNX Detector

from edgeyolo_runner import ONNXDetector
import cv2

# Initialize ONNX detector
detector = ONNXDetector(
    model_path="path/to/your/model.onnx",
    conf_thres=0.25,
    nms_thres=0.5,
    fp16=False,  # Set to True for FP16 inference
    use_cuda=True  # Set to False for CPU inference
)

# Read and process image
image = cv2.imread("image.jpg")
detections = detector(image)
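The nms_thres parameter controls non-maximum suppression: any box that overlaps a higher-scoring box by more than this IoU is discarded. A minimal NumPy sketch of the idea (illustrative only; the package's own NMS implementation may differ):

```python
import numpy as np

def nms(boxes, scores, iou_thres=0.5):
    """Greedy non-maximum suppression.
    boxes: (N, 4) array of [x1, y1, x2, y2]; scores: (N,) confidences.
    Returns indices of the boxes that survive."""
    boxes = np.asarray(boxes, dtype=float)
    scores = np.asarray(scores, dtype=float)
    order = scores.argsort()[::-1]  # highest score first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        rest = order[1:]
        if rest.size == 0:
            break
        # intersection of box i with each remaining box
        x1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        y1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        x2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        y2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_r = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + area_r - inter)
        order = rest[iou <= iou_thres]  # drop heavy overlaps
    return keep
```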

TensorFlow Lite Detector

from edgeyolo_runner import TFLiteDetector
import cv2

# Initialize TFLite detector
detector = TFLiteDetector(
    model_path="path/to/your/model.tflite",
    conf_thres=0.25,
    nms_thres=0.5,
    fp16=False,  # Set to True for FP16 inference
    use_gpu=True  # Set to False for CPU inference
)

# Read and process image
image = cv2.imread("image.jpg")
detections = detector(image)

Video Detection

from edgeyolo_runner import DetectorFactory
import cv2

# Works with both ONNX and TFLite models
detector = DetectorFactory.create_detector("path/to/your/model.onnx")

cap = cv2.VideoCapture("video.mp4")
while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
        
    detections = detector(frame)
    # Process detections and draw on frame
    
    print(f"Inference time: {detector.dt*1000:.2f}ms")
    
cap.release()
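The per-frame dt attribute shown above can be aggregated to report throughput over a whole clip. A small accumulator sketch (the FpsMeter class is hypothetical, not part of the package):

```python
class FpsMeter:
    """Accumulate per-frame inference times (seconds) and report averages."""
    def __init__(self):
        self.total = 0.0
        self.frames = 0

    def update(self, dt):
        self.total += dt
        self.frames += 1

    @property
    def avg_ms(self):
        """Average inference time per frame, in milliseconds."""
        return 1000.0 * self.total / max(self.frames, 1)

    @property
    def fps(self):
        """Frames per second over all recorded frames."""
        return self.frames / self.total if self.total > 0 else 0.0

# inside the video loop, after detector(frame):
#     meter.update(detector.dt)
```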

Command Line Usage

Use the provided example script to compare detectors:

# Auto-detect model type and run inference
python examples/detector_comparison.py --model model.onnx --image image.jpg

# Specifically use TFLite detector
python examples/detector_comparison.py --model model.tflite --image image.jpg --detector tflite

# Use hardware acceleration
python examples/detector_comparison.py --model model.onnx --image image.jpg --use-acceleration

# Save output
python examples/detector_comparison.py --model model.onnx --image image.jpg --output result.jpg

Features

  • Dual Runtime Support: Both ONNX Runtime and TensorFlow Lite
  • Auto Model Detection: Automatic model type detection based on file extension
  • Hardware Acceleration: CUDA support for ONNX, GPU delegation for TFLite
  • Precision Options: FP16 and FP32 precision support
  • Easy-to-use API: Unified interface for both model types
  • Batch Processing: Support for processing multiple images
  • Performance Monitoring: Built-in inference time measurement
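For batch processing, the simplest pattern is to map the detector callable over a list of images. A sketch of such a helper (illustrative, not a package API):

```python
def run_batch(detector, images):
    """Run a detector callable over an iterable of images,
    returning a list of (image, detections) pairs."""
    return [(img, detector(img)) for img in images]

# usage with any detector created by DetectorFactory:
#     results = run_batch(detector, [cv2.imread(p) for p in paths])
```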

Model Support

  • ONNX Models: .onnx files using ONNX Runtime
  • TensorFlow Lite Models: .tflite files using TensorFlow Lite

Requirements

Core Dependencies

  • Python >= 3.7
  • ONNX Runtime >= 1.15.0
  • OpenCV >= 4.5.0
  • NumPy >= 1.19.0
  • Pillow >= 8.0.0

Optional Dependencies

  • TensorFlow >= 2.8.0: For TensorFlow Lite support
  • ONNX Runtime GPU: For CUDA acceleration

License

MIT License
