A package for running inference with models developed at phosphene.

Project description

Torch CUDA install

If the identity_clustering package installed the CPU-only build of Torch, you can switch to the correct CUDA build either by installing it manually or by running the following command:

python -m model_inference.torch_install
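To tell whether the currently installed build is CPU-only, you can inspect `torch.__version__`: CUDA wheels carry a `+cuXXX` suffix, CPU-only wheels carry `+cpu`. A minimal sketch of that check (the helper name is illustrative, not part of this package):

```python
def is_cpu_only_build(version: str) -> bool:
    """Return True when a torch version string denotes a CPU-only build."""
    # CPU-only wheels look like "2.3.1+cpu"; CUDA wheels like "2.3.1+cu121".
    return version.split("+")[-1] == "cpu" if "+" in version else False

print(is_cpu_only_build("2.3.1+cpu"))    # True
print(is_cpu_only_build("2.3.1+cu121"))  # False
```

With torch installed, `is_cpu_only_build(torch.__version__)` tells you whether to rerun the install helper.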

Sphinx Documentation

To view the Sphinx documentation, clone this repository and run the following commands:

cd pkg-faceswap-inference

# install Sphinx and a theme
pip install sphinx
pip install sphinx-rtd-theme

# generate docs
sphinx-build -b html docs/ build/

Open the build folder; there you will find index.html. Open it in a browser.


Inference Class Documentation

The Inference class provides functionality for running inference on video files with models. It includes utilities for timing and tracking nested function calls to aid performance analysis.

Class Attributes

  • device (str): Specifies the device (cpu or cuda) used for computation.
  • shape (tuple): Dimensions for resizing clustered faces, default is (224, 224).
  • classes (list): The categories the model can predict, such as ["Real", "Fake"].
  • timings (dict): Stores timing data for functions with nested call relationships.
  • _clusters (None or dict): Stores clusters of faces post-clustering.

Methods

__init__(self, device: str, shape=(224, 224))

  • Initializes the Inference class with specified device and shape attributes.

generate_video_data(self, video_path: str, print_timings=True)

  • Processes a video to detect, crop, and cluster faces, and converts them to RGB format.

get_data(self, video_path: str, print_timings=True)

  • Retrieves essential data for frames, bounding boxes, images, FPS, and clustered identities from a video.

get_predictions(self, model, images: torch.Tensor, device='cuda')

  • Runs predictions on clustered face images using a model and returns logits and labels.
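Mapping logits to class labels is a standard argmax step; a minimal pure-Python sketch of the idea (the real method operates on `torch.Tensor`s, and this helper name is illustrative):

```python
def logits_to_labels(logits, classes):
    """Pick the class with the highest score for each row of logits."""
    return [classes[max(range(len(row)), key=row.__getitem__)] for row in logits]

print(logits_to_labels([[0.2, 0.8], [0.9, 0.1]], ["Real", "Fake"]))  # ['Fake', 'Real']
```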

__cvt_to_rgb(self, faces: tuple) -> torch.Tensor

  • Converts images from BGR to RGB format.
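BGR-to-RGB conversion amounts to reversing the channel order of each pixel (OpenCV decodes frames as BGR, while most models expect RGB). A per-pixel sketch of the idea, independent of torch:

```python
def bgr_to_rgb(pixel):
    """Reverse the channel order of a single (B, G, R) pixel."""
    b, g, r = pixel
    return (r, g, b)

print(bgr_to_rgb((255, 0, 0)))  # (0, 0, 255): pure blue in BGR is pure blue in RGB order
```

On a whole image tensor the same effect is typically achieved by flipping the channel dimension in one operation rather than per pixel.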

__plot_images_grid(self, tensor: torch.Tensor, images_per_row=4)

  • Plots a grid of images from a 4D tensor.

__print_result(self, result: dict, image_data: List[torch.Tensor])

  • Displays results, including images, based on the model’s predictions.

__print_timings(self, timings: dict)

  • Outputs timing information for functions with nested call relationships.

__create_sequence_dict(self, identity_data)

  • Organizes identity information into a sequence dictionary.
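Organizing identity data into a sequence dictionary is essentially a group-by over identities. A minimal sketch, assuming a record layout of `(frame_idx, identity, bbox)` (the actual layout used by the package may differ):

```python
from collections import defaultdict

def create_sequence_dict(identity_data):
    """Group (frame_idx, identity, bbox) records by identity."""
    sequences = defaultdict(list)
    for frame_idx, identity, bbox in identity_data:
        sequences[identity].append((frame_idx, bbox))
    return dict(sequences)

data = [
    (0, "id0", (10, 10, 50, 50)),
    (1, "id0", (12, 11, 52, 51)),
    (1, "id1", (100, 40, 140, 90)),
]
print(create_sequence_dict(data))  # one list of (frame, bbox) pairs per identity
```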

__draw_bounding_boxes(self, video_path: str, sequence_dict, result_video_path: str)

  • Draws bounding boxes on faces in the video and saves the processed output.

Decorator timeit(func)

  • Times functions and records their nested relationships in the timings attribute.
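The nested-timing idea can be sketched as follows; the names and data layout here are illustrative, not the package's actual implementation:

```python
import functools
import time

_call_stack = []  # chain of decorated functions currently executing
timings = {}      # maps "outer.inner" call paths to elapsed seconds

def timeit(func):
    """Record how long func takes, keyed by its position in the call stack."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        _call_stack.append(func.__name__)
        start = time.perf_counter()
        try:
            return func(*args, **kwargs)
        finally:
            timings[".".join(_call_stack)] = time.perf_counter() - start
            _call_stack.pop()
    return wrapper

@timeit
def inner():
    time.sleep(0.01)

@timeit
def outer():
    inner()

outer()
print(sorted(timings))  # ['outer', 'outer.inner']
```

Keying each record by the dotted call path is what makes the nested relationships recoverable when the timings are printed.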

Usage

Here's a simple example of using the Inference class:

from inference import Inference
import torch
from models.models_list import ModelList

# Initialize Inference class
device = "cuda" if torch.cuda.is_available() else "cpu"
inference = Inference(device=device)

# Load your model from ModelList
model = ModelList().load_model("face_detection_model", device)

# Process video
video_path = "/path/to/video.mp4"
output_data, num_clusters = inference.generate_video_data(video_path)

# Get predictions for each cluster
for cluster_images in output_data:
    predictions = inference.get_predictions(model, cluster_images, device)
    print(predictions)  # __print_result is name-mangled (private), so call it from inside the class only

License

This project is licensed under the MIT License. See the LICENSE file for details.



Download files

Download the file for your platform.

Source Distribution

model_inference-0.0.6.tar.gz (9.8 kB)

Built Distribution

model_inference-0.0.6-py3-none-any.whl (10.8 kB)

File details

Details for the file model_inference-0.0.6.tar.gz.

File metadata

  • Download URL: model_inference-0.0.6.tar.gz
  • Upload date:
  • Size: 9.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.0.1 CPython/3.12.8 Linux/6.8.0-1020-azure

File hashes

Hashes for model_inference-0.0.6.tar.gz
  • SHA256: 7b671abda5ab81d763f349357ceaf58782b52121e999e0b0a7110134e0127956
  • MD5: a1fa162742f7e28845784044a5e35563
  • BLAKE2b-256: acfe20fecf4e6c7a0ba4ff15ad2c0b7d99e21bbaadbede10016669dc84e40ca3


File details

Details for the file model_inference-0.0.6-py3-none-any.whl.

File metadata

  • Download URL: model_inference-0.0.6-py3-none-any.whl
  • Upload date:
  • Size: 10.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.0.1 CPython/3.12.8 Linux/6.8.0-1020-azure

File hashes

Hashes for model_inference-0.0.6-py3-none-any.whl
  • SHA256: 17ab346de55058b7efbe3e7267002afc75f2e238ad2466e7a0750d22ab4667ae
  • MD5: fcbbb92c87187bb68ac700739ec81b4d
  • BLAKE2b-256: 43c2fe082e306d8dca9262c21465beebb3cf977116a9aee4abfa8b3725d68f61
