Pyinfer

Pyinfer is a model agnostic tool for ML developers and researchers to benchmark the inference statistics for machine learning models or functions.

Installation

pip install pyinfer

Overview

Inference Report

InferenceReport reports inference statistics for a single model artifact. To create a report, pass it a callable model function or method, valid input(s), and either n_iterations or n_seconds to set how long the report runs. Check out the docs for more information on the optional parameters that can be passed.
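The n_iterations / n_seconds semantics can be illustrated with a standard-library-only sketch. This is not Pyinfer's implementation, and `benchmark` below is a hypothetical helper, not part of Pyinfer's API: it times repeated calls to a callable, stopping after a fixed number of iterations or after a wall-clock budget expires.

```python
import time

def benchmark(fn, inputs, n_iterations=None, n_seconds=None):
    """Time repeated calls to fn(inputs), stopping after n_iterations
    runs or after roughly n_seconds of wall-clock time."""
    if n_iterations is None and n_seconds is None:
        raise ValueError("pass n_iterations or n_seconds")
    runs = []
    start = time.perf_counter()
    while True:
        if n_iterations is not None and len(runs) >= n_iterations:
            break
        if n_seconds is not None and time.perf_counter() - start >= n_seconds:
            break
        t0 = time.perf_counter()
        fn(inputs)  # one "inference"
        runs.append(time.perf_counter() - t0)
    return runs

# Example: time a trivial stand-in "model" for 100 iterations.
runs = benchmark(lambda x: sum(i * i for i in x), range(1000), n_iterations=100)
print(len(runs), sum(runs) / len(runs))
```

From the per-run times collected this way, all of the summary statistics a report prints can be derived.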


Multi Inference Report

MultiInferenceReport reports inference statistics for a list of model artifacts. To create a multi report, pass it a list of callable model functions or methods, a list of valid input(s), and either n_iterations or n_seconds to set how long the report runs. Check out the docs for more information on the optional parameters that can be passed.
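The idea of pairing each callable with its own inputs and comparing summary statistics can be sketched with the standard library. This is an illustration of the concept, not Pyinfer's API; `time_runs` and the toy "models" are hypothetical:

```python
import statistics
import time

def time_runs(fn, inputs, n_iterations):
    # Hypothetical helper: per-call wall-clock times for n_iterations calls.
    runs = []
    for _ in range(n_iterations):
        t0 = time.perf_counter()
        fn(inputs)
        runs.append(time.perf_counter() - t0)
    return runs

# Two toy "models", each benchmarked on its own inputs,
# the way a multi report pairs models with inputs.
models = [sorted, lambda xs: max(xs)]
inputs = [list(range(500, 0, -1)), list(range(500))]

report = {
    getattr(fn, "__name__", "model_%d" % i): statistics.mean(time_runs(fn, x, 50))
    for i, (fn, x) in enumerate(zip(models, inputs))
}
print(report)
```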


Example Outputs

Table Report

(screenshot: example table report output)

Run Plot

(screenshot: example run plot)

Stats Currently Included

  • Success Rate - Number of successful inferences completed within the specified time threshold.
  • Failures - Number of inferences that exceeded the specified time threshold.
  • Time Taken - Total time taken to run all inferences.
  • Inferences Per Second - Estimate of how many inferences per second the selected model can perform.
  • Max Run - The maximum time taken to perform an inference in a given run.
  • Min Run - The minimum time taken to perform an inference in a given run.
  • Std - The standard deviation of run times.
  • Mean - The mean run time.
  • Median - The median run time.
  • IQR - The interquartile range of the run times.
  • Cores Logical - The number of logical cores on the host machine.
  • Cores Physical - The number of physical cores on the host machine.
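Given a list of per-run times, every statistic above follows from the standard library (physical-core counts are the exception, and typically need a third-party package such as psutil). A minimal sketch, with made-up run times and a hypothetical failure threshold:

```python
import os
import statistics

runs = [0.012, 0.011, 0.013, 0.010, 0.045, 0.011]  # per-inference seconds (made up)
threshold = 0.020  # hypothetical failure cutoff in seconds

stats = {
    "success_rate": sum(r <= threshold for r in runs),
    "failures": sum(r > threshold for r in runs),
    "time_taken": sum(runs),
    "inferences_per_second": len(runs) / sum(runs),
    "max_run": max(runs),
    "min_run": min(runs),
    "std": statistics.stdev(runs),
    "mean": statistics.mean(runs),
    "median": statistics.median(runs),
    "cores_logical": os.cpu_count(),  # physical cores usually need psutil
}
# IQR = Q3 - Q1, via statistics.quantiles (Python 3.8+)
q1, _, q3 = statistics.quantiles(runs, n=4)
stats["iqr"] = q3 - q1
print(stats)
```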

Planned Future Stats

  • Model Size - Information relating to the size of the model in bytes.
  • GPU Stat Support - Information about whether a GPU is available and whether it is being utilized.
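One framework-agnostic way to estimate a model's size in bytes (an assumption about how such a stat might be computed, not Pyinfer's planned approach) is to measure the serialized artifact:

```python
import pickle

# Hypothetical stand-in for a model artifact: any picklable Python object.
model = {"weights": list(range(10_000)), "bias": [0.0] * 128}

# Rough size estimate: length in bytes of the serialized object.
serialized_bytes = len(pickle.dumps(model))
print(serialized_bytes)
```

Framework-specific artifacts (e.g. saved checkpoints) are usually sized by inspecting the saved file instead.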


Source Distribution

pyinfer-0.0.3.tar.gz (9.3 kB)
