A confusion matrix is a matrix used to summarize a classifier's predictions against the actual labels. It contains four cells, each corresponding to one combination of a predicted positive or negative and an actual positive or negative: true positives (TP), false positives (FP), true negatives (TN), and false negatives (FN). Many classifier metrics are based on the confusion matrix, so it's helpful to keep an image of it stored in your mind.

[Image: confusion matrix]

Sensitivity (also called recall) is the fraction of actual positives that were accurately predicted, calculated as TP/(TP+FN) (note that false negatives are actually positives). Sensitivity is a good metric to use in contexts where correctly predicting positives is important, like medical diagnoses. In some cases false positives can be dangerous, but it is generally agreed that false negatives (e.g. a diagnosis of 'no cancer' in someone who does have cancer) are more deadly. By having the model maximize sensitivity, we target its ability to correctly classify positives.

Specificity is the fraction of actual negatives that were accurately predicted, calculated as TN/(TN+FP) (note that false positives are actually negatives).
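As a minimal sketch of these definitions, the snippet below tallies the four confusion-matrix cells from a pair of hypothetical label lists (the `y_true`/`y_pred` data is made up for illustration) and computes sensitivity and specificity from them:

```python
# Hypothetical example data: 1 = positive, 0 = negative.
y_true = [1, 1, 1, 0, 0, 0, 0, 1]  # actual labels
y_pred = [1, 0, 1, 0, 1, 0, 0, 1]  # classifier's predictions

# Tally the four cells of the confusion matrix.
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)

sensitivity = tp / (tp + fn)  # TP/(TP+FN): fraction of actual positives caught
specificity = tn / (tn + fp)  # TN/(TN+FP): fraction of actual negatives caught

print(f"sensitivity = {sensitivity}, specificity = {specificity}")
```

With this toy data, 3 of the 4 actual positives and 3 of the 4 actual negatives are predicted correctly, so both metrics come out to 0.75.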