From c13f28c97104fdf3c0ca39ecd7a8f93d73ae66d5 Mon Sep 17 00:00:00 2001
From: RyanXin
Date: Mon, 26 Jul 2021 13:12:18 +0800
Subject: [PATCH] Correct the definitions of precision and recall.

---
 2-Regression/4-Logistic/README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/2-Regression/4-Logistic/README.md b/2-Regression/4-Logistic/README.md
index a4488c11..8d89ce9d 100644
--- a/2-Regression/4-Logistic/README.md
+++ b/2-Regression/4-Logistic/README.md
@@ -237,9 +237,9 @@ As you might have guessed it's preferable to have a larger number of true positi
 
 Let's revisit the terms we saw earlier with the help of the confusion matrix's mapping of TP/TN and FP/FN:
 
-🎓 Precision: TP/(TP + FN) The fraction of relevant instances among the retrieved instances (e.g. which labels were well-labeled)
+🎓 Precision: TP/(TP + FP) The fraction of relevant instances among the retrieved instances (e.g. which labels were well-labeled)
 
-🎓 Recall: TP/(TP + FP) The fraction of relevant instances that were retrieved, whether well-labeled or not
+🎓 Recall: TP/(TP + FN) The fraction of relevant instances that were retrieved, whether well-labeled or not
 
 🎓 f1-score: (2 * precision * recall)/(precision + recall) A weighted average of the precision and recall, with best being 1 and worst being 0
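
The corrected definitions above can be sanity-checked with a minimal Python sketch (not part of the patch; the `tp`/`fp`/`fn` counts below are made-up values chosen only for illustration):

```python
# Hypothetical confusion-matrix counts for a binary classifier.
tp, fp, fn = 8, 2, 4

# Precision: fraction of retrieved (positively labeled) instances
# that are actually relevant -> TP / (TP + FP).
precision = tp / (tp + fp)

# Recall: fraction of relevant instances that were retrieved,
# whether well-labeled or not -> TP / (TP + FN).
recall = tp / (tp + fn)

# f1-score: harmonic mean of precision and recall,
# best is 1 and worst is 0.
f1 = (2 * precision * recall) / (precision + recall)

print(precision)  # 0.8
print(recall)     # 0.666...
print(f1)         # 0.727...
```

Note that swapping FP and FN (as in the pre-patch text) would make precision and recall trade definitions, which is exactly what this patch corrects.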