Basic metrics for evaluating classification results
Project description
A confusion matrix is a summary of prediction results for a classification problem. The numbers of correct and incorrect predictions are summarized with count values, broken down by class.
The function takes two arrays of the same length and returns a list of four counts: TN, FP, FN, and TP. From the confusion matrix we can calculate precision, recall, F1 score, FDR, and accuracy (see the sketch after this list):
- Calculate true positives, true negatives, false positives, and false negatives.
- Accuracy: the fraction of predictions the classifier gets right.
- F1 score: harmonic mean of precision and recall.
- Precision: what percentage of predicted positives are actually positive?
- Recall: how many actual positives are correctly classified?
- False discovery rate (FDR): what percentage of predicted positives are actually negative?
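As a rough illustration of how these metrics follow from the four counts, here is a minimal sketch. The names (confusion_matrix, metrics, y_true, y_pred) are assumptions for illustration, since the package's actual API is not shown above.

```python
# Minimal sketch of the metrics described above.
# Function and argument names are assumed, not the package's actual API.

def confusion_matrix(y_true, y_pred):
    """Return [TN, FP, FN, TP] for two equal-length arrays of 0/1 labels."""
    if len(y_true) != len(y_pred):
        raise ValueError("y_true and y_pred must have the same length")
    tn = fp = fn = tp = 0
    for actual, predicted in zip(y_true, y_pred):
        if actual == 1 and predicted == 1:
            tp += 1
        elif actual == 0 and predicted == 0:
            tn += 1
        elif actual == 0 and predicted == 1:
            fp += 1
        else:  # actual == 1 and predicted == 0
            fn += 1
    return [tn, fp, fn, tp]

def metrics(y_true, y_pred):
    """Derive accuracy, precision, recall, F1, and FDR from the four counts."""
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp) if (tp + fp) else 0.0  # predicted positives that are positive
    recall = tp / (tp + fn) if (tp + fn) else 0.0     # actual positives correctly classified
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    fdr = fp / (fp + tp) if (fp + tp) else 0.0        # predicted positives that are wrong
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1, "fdr": fdr}

# Example usage:
y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 0, 1, 1, 1]
print(confusion_matrix(y_true, y_pred))  # [1, 1, 1, 3]
print(metrics(y_true, y_pred))
```

Note that precision and FDR are complements over the same denominator (TP + FP), so FDR = 1 - precision whenever any positives are predicted.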
CHANGE LOG
0.0.1 (16/01/2022)
First Release