
Pyinfer

Pyinfer is a model-agnostic tool for ML developers and researchers to benchmark inference statistics for machine learning models or functions.

Installation

pip install pyinfer

Overview

Inference Report

InferenceReport reports inference statistics for a single model artifact. To create a valid report, pass it a callable model function or method, valid input(s), and either n_iterations or n_seconds to set the report's run duration. Check out the docs for more information on the optional parameters that can be passed.

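For example, a minimal report might look like the sketch below. The toy_model callable, the keyword names (model, inputs, n_seconds), and the run() call are illustrative assumptions based on the description above; check the docs for the exact signature.

import time

from pyinfer import InferenceReport

# Any callable can stand in for a model; this toy function just
# simulates a few milliseconds of inference work.
def toy_model(x):
    time.sleep(0.005)  # stand-in for a real forward pass
    return x * 2

# Benchmark the callable for 10 seconds and collect the statistics.
# Keyword names and run() are assumptions -- see the docs for the exact API.
report = InferenceReport(model=toy_model, inputs=7, n_seconds=10)
results = report.run()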

Multi Inference Report

MultiInferenceReport reports inference statistics for a list of model artifacts. To create a valid multi report, pass it a list of callable model functions or methods, a list of valid input(s), and either n_iterations or n_seconds to set the report's run duration. Check out the docs for more information on the optional parameters that can be passed.

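A rough sketch of comparing two callables, assuming list-valued arguments as described above; the keyword names (models, inputs, n_iterations) and run() are illustrative and may differ from the actual signature.

import time

from pyinfer import MultiInferenceReport

# Two toy callables with different simulated latencies to compare.
def fast_model(x):
    time.sleep(0.002)
    return x + 1

def slow_model(x):
    time.sleep(0.01)
    return x + 1

# Benchmark both callables for 100 iterations each and report the results.
# Keyword names and run() are assumptions -- see the docs for the exact API.
report = MultiInferenceReport(
    models=[fast_model, slow_model],
    inputs=[7, 7],
    n_iterations=100,
)
results = report.run()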

Example Outputs

Table Report

Pyinfer Table Report

Run Plot

Pyinfer Report Plot

Stats Currently Included

  • Success Rate - Number of inferences that completed within the specified time range.
  • Failures - Number of inferences that exceeded the specified time range.
  • Time Taken - Total time taken to run all inferences.
  • Inference Per Second - Estimate of how many inferences per second the selected model can perform (see the worked example after this list).
  • Max Run - The maximum time taken to perform an inference for a given run.
  • Min Run - The minimum time taken to perform an inference for a given run.
  • Std - The standard deviation of the run times.
  • Mean - The mean run time.
  • Median - The median run time.
  • IQR - The interquartile range of the run times.
  • Cores Logical - The number of logical cores on the host machine.
  • Cores Physical - The number of physical cores on the host machine.
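As a rough worked example of how these figures relate (assuming Inference Per Second is derived from the completed count and the total time): a run with n_iterations=100 that finishes in 2 seconds gives a Time Taken of about 2 seconds, a mean run time of about 20 ms, and roughly 100 / 2 = 50 inferences per second.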

Planned Future Stats

  • Model Size - Information relating to the size of the model in bytes.
  • GPU Stat Support - Information about whether a GPU is available and whether it is being utilized.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pyinfer-0.0.3.tar.gz (9.3 kB)


File details

Details for the file pyinfer-0.0.3.tar.gz.

File metadata

  • Download URL: pyinfer-0.0.3.tar.gz
  • Upload date:
  • Size: 9.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/47.3.1.post20200622 requests-toolbelt/0.9.1 tqdm/4.47.0 CPython/3.7.7

File hashes

Hashes for pyinfer-0.0.3.tar.gz

  • SHA256: 97aa21f9c0b65eb0527f68efcf88bde1c1b1a39958335a6eb82361cf85ccd273
  • MD5: e5430deb8ad8989a17d660070561280e
  • BLAKE2b-256: a59bf23ce1d8c12d4d88d8cbdc27823dfad6911e6f33a3414a3f466790605208

