From 0978f8d6c50b3aebfb1160c1edba1ce513e45c40 Mon Sep 17 00:00:00 2001
From: Jen Looper
Date: Mon, 24 May 2021 15:23:33 -0400
Subject: [PATCH] link edit

---
 Regression/4-Logistic/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/Regression/4-Logistic/README.md b/Regression/4-Logistic/README.md
index 8a0c3bda..421d4fb8 100644
--- a/Regression/4-Logistic/README.md
+++ b/Regression/4-Logistic/README.md
@@ -171,7 +171,7 @@ Predicted labels: [0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0
 
 ## Better comprehension via a confusion matrix
 
-While you can get a scoreboard report [terms](https://scikit-learn.org/stable/modules/generated/sklearn.metrics.classification_report.html?highlight=classification_report#sklearn.metrics.classification_report) by printing out the items above, you might be able to understand your model more easily by using a [confusion matrix]() to help us understand how the model is performing.
+While you can get a scoreboard report [terms](https://scikit-learn.org/stable/modules/generated/sklearn.metrics.classification_report.html?highlight=classification_report#sklearn.metrics.classification_report) by printing out the items above, you might be able to understand your model more easily by using a [confusion matrix](https://scikit-learn.org/stable/modules/model_evaluation.html#confusion-matrix) to help us understand how the model is performing.
 
 > 🎓 A '[confusion matrix](https://en.wikipedia.org/wiki/Confusion_matrix)' (or 'error matrix') is a table that expresses your model's true vs. false positives and negatives, thus gauging the accuracy of predictions.
 
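For context on the two scikit-learn helpers the corrected link relates to, here is a minimal, self-contained sketch of printing a classification report and a confusion matrix. It uses a synthetic dataset from `make_classification` in place of the lesson's own data, which is not part of this patch:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report, confusion_matrix
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data standing in for the lesson's dataset.
X, y = make_classification(n_samples=200, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
predictions = model.predict(X_test)

# The "scoreboard" report: per-class precision, recall, and F1.
print(classification_report(y_test, predictions))

# The confusion matrix: rows are true labels, columns are predicted labels.
# For a binary problem this prints [[TN, FP], [FN, TP]].
print(confusion_matrix(y_test, predictions))
```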