Basic metrics for evaluating classification results
Project description
A confusion matrix is a summary of the prediction results of a classification problem. The numbers of correct and incorrect predictions are summarized as counts and broken down by class.
The function takes two arrays of the same length and returns a list of four values: TN, FP, FN, and TP. From the confusion matrix we can calculate precision, recall, F1 score, FDR, and accuracy.
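As a minimal sketch of the idea (the function name `summary` below is hypothetical, not necessarily this package's API), the four counts and the derived metrics can be computed for binary labels (0 = negative, 1 = positive) like this:

```python
def summary(y_true, y_pred):
    """Return [TN, FP, FN, TP] for two equal-length binary label arrays."""
    if len(y_true) != len(y_pred):
        raise ValueError("arrays must have the same length")
    tn = fp = fn = tp = 0
    for t, p in zip(y_true, y_pred):
        if t == 1 and p == 1:
            tp += 1          # true positive: predicted 1, actually 1
        elif t == 0 and p == 0:
            tn += 1          # true negative: predicted 0, actually 0
        elif t == 0 and p == 1:
            fp += 1          # false positive: predicted 1, actually 0
        else:
            fn += 1          # false negative: predicted 0, actually 1
    return [tn, fp, fn, tp]

tn, fp, fn, tp = summary([0, 1, 1, 0, 1], [0, 1, 0, 1, 1])
precision = tp / (tp + fp)                        # TP / (TP + FP)
recall    = tp / (tp + fn)                        # TP / (TP + FN)
f1        = 2 * precision * recall / (precision + recall)
fdr       = fp / (fp + tp)                        # false discovery rate
accuracy  = (tp + tn) / (tp + tn + fp + fn)
```

For the example arrays above, the counts are `[TN, FP, FN, TP] = [1, 1, 1, 2]`, giving precision 2/3, recall 2/3, FDR 1/3, and accuracy 3/5.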
CHANGE LOG
0.0.1 (16/01/2022)
First Release
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Hashes for summary_classification-0.0.1.tar.gz
Algorithm | Hash digest
---|---
SHA256 | 08ca856bd09f07cc045f40f53a5908405e172b7e08c2241d5260612965fafeb7
MD5 | 80eeb2a15bfdc75eabc6eed3dd606103
BLAKE2b-256 | 5b44dbd2a48ccb35bc526ddd63cb514f1f1e6d91116c2216c6709441049a7633