
Does the same thing as sklearn.metrics, up to 100x faster.

Project description

Fast Metrics

This project provides faster alternatives to some of the methods in sklearn.metrics. For the classification metrics it supports, you will see a substantial runtime improvement, up to 100x depending on the function used and the size of the data set. The speedup comes from using Numba to vectorize the generation of confusion matrices.
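
For intuition, here is a minimal sketch of the general idea: a Numba-compiled loop that tallies the binary confusion matrix in a single pass, from which metrics like accuracy can then be computed. This is an illustration only, not the package's actual implementation; the function name binary_confusion_counts is made up.

import numpy as np
from numba import njit

@njit(cache=True)
def binary_confusion_counts(y_true, y_pred):
    # Count true negatives, false positives, false negatives and true positives
    # in one compiled pass over 0/1 (or boolean) label arrays.
    tn, fp, fn, tp = 0, 0, 0, 0
    for i in range(y_true.shape[0]):
        if y_true[i] == 1 and y_pred[i] == 1:
            tp += 1
        elif y_true[i] == 1:
            fn += 1
        elif y_pred[i] == 1:
            fp += 1
        else:
            tn += 1
    return tn, fp, fn, tp

y_true = np.random.rand(1_000_000) > 0.5
y_pred = np.random.rand(1_000_000) > 0.5
tn, fp, fn, tp = binary_confusion_counts(y_true, y_pred)
print((tp + tn) / (tp + tn + fp + fn))  # accuracy derived from the four counts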

Fast Metrics supports the following metrics:

  • Accuracy Score
  • Balanced Accuracy Score
  • F1 Score
  • Precision
  • Recall
  • Jaccard Score
  • ROC AUC

Installation

Run the following to install:

pip install fastmetrics

Usage

Use Fast Metrics the same way you would use sklearn.metrics. Pass each function a NumPy array of true labels and a NumPy array of predictions, and it will return the corresponding metric. For instance:

fastmetrics.fast_accuracy_score(y_true, y_pred)
fastmetrics.fast_balanced_accuracy_score(y_true, y_pred)
fastmetrics.fast_f1_score(y_true, y_pred)
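
As a fuller sketch, the example below builds two small boolean NumPy arrays (made-up illustration data) and passes them to the functions above:

import numpy as np
import fastmetrics

# Made-up boolean labels and predictions, purely for illustration.
y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0], dtype=bool)
y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0], dtype=bool)

# 6 of the 8 predictions match the true labels, so the accuracy score is 0.75.
print(fastmetrics.fast_accuracy_score(y_true, y_pred))
print(fastmetrics.fast_f1_score(y_true, y_pred))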

Additional Notes:

  • Unlike their sklearn counterparts, Fast Metrics functions take only y_true and y_pred; sklearn's optional keyword arguments are not supported.
  • The first time you call a Fast Metrics function, it will be slower than its sklearn counterpart because Numba needs time to compile the vectorized confusion matrix code; subsequent calls reuse the compiled code. A short warm-up sketch follows these notes.
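
Because of this one-time compilation cost, a fair benchmark warms the function up once before timing it. A minimal sketch, assuming the same kind of boolean-array inputs as above:

import timeit

import numpy as np
import fastmetrics

y_true = np.random.rand(1_000_000) > 0.5
y_pred = np.random.rand(1_000_000) > 0.5

# The first call pays Numba's one-time compilation cost.
fastmetrics.fast_f1_score(y_true, y_pred)

# Subsequent calls reuse the compiled code, which is where the speedup appears.
print(timeit.timeit(lambda: fastmetrics.fast_f1_score(y_true, y_pred), number=100))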

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

fastmetrics-0.0.10.tar.gz (3.5 kB)

Built Distribution

fastmetrics-0.0.10-py3-none-any.whl (3.9 kB)
