From 1b6274f711abec765549d1c949da065044e14879 Mon Sep 17 00:00:00 2001
From: Jasleen Sondhi
Date: Sun, 17 Sep 2023 03:30:02 +0530
Subject: [PATCH] refactored text

---
 2-Regression/4-Logistic/solution/R/lesson_4.Rmd | 17 -----------------
 1 file changed, 17 deletions(-)

diff --git a/2-Regression/4-Logistic/solution/R/lesson_4.Rmd b/2-Regression/4-Logistic/solution/R/lesson_4.Rmd
index 18407b31..16610df3 100644
--- a/2-Regression/4-Logistic/solution/R/lesson_4.Rmd
+++ b/2-Regression/4-Logistic/solution/R/lesson_4.Rmd
@@ -72,17 +72,6 @@ Logistic regression does not offer the same features as linear regression. The f
 
 ![Infographic by Dasani Madipalli](../../images/pumpkin-classifier.png){width="600"}
 
-#### **Other classifications**
-
-There are other types of logistic regression, including multinomial and ordinal:
-
--   **Multinomial**, which involves having more than one category - "Orange, White, and Striped".
-
--   **Ordinal**, which involves ordered categories, useful if we wanted to order our outcomes logically, like our pumpkins that are ordered by a finite number of sizes (mini,sm,med,lg,xl,xxl).
-
-![Multinomial vs ordinal regression](https://github.com/microsoft/ML-For-Beginners/blob/main/2-Regression/4-Logistic/images/multinomial-vs-ordinal.png)
-
-
 #### **Variables DO NOT have to correlate**
 
 Remember how linear regression worked better with more correlated variables? Logistic regression is the opposite - the variables don't have to align. That works for this data which has somewhat weak correlations.
@@ -238,11 +227,8 @@ baked_pumpkins %>%
   scale_color_brewer(palette = "Dark2", direction = -1) +
   theme(legend.position = "none")
 ```
-
-
 Now that we have an idea of the relationship between the binary categories of color and the larger group of sizes, let's explore logistic regression to determine a given pumpkin's likely color.
-
 
 ## 3. Build your model
 
 Let's begin by splitting the data into `training` and `test` sets. The training set is used to train a classifier so that it finds a statistical relationship between the features and the label value.
@@ -262,8 +248,6 @@ pumpkins_test <- testing(pumpkins_split)
 # Print out the first 5 rows of the training set
 pumpkins_train %>%
   slice_head(n = 5)
-
-
 ```
 
 🙌 We are now ready to train a model by fitting the training features to the training label (color).
@@ -299,7 +283,6 @@ log_reg_wf <- workflow() %>%
 
 # Print out the workflow
 log_reg_wf
-
 ```
 
 After a workflow has been *specified*, a model can be `trained` using the [`fit()`](https://tidymodels.github.io/parsnip/reference/fit.html) function. The workflow will estimate a recipe and preprocess the data before training, so we won't have to manually do that using prep and bake.
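
For readers following the lesson text that this patch touches, the `fit()` step described in the final context paragraph can be sketched as below. This is a minimal sketch, not part of the patch: it assumes the `log_reg_wf` workflow and the `pumpkins_train`/`pumpkins_test` splits defined earlier in `lesson_4.Rmd`, and the object name `wf_fit` is purely illustrative.

```r
# Minimal sketch, assuming the tidymodels packages are attached and that
# `log_reg_wf` (the logistic regression workflow) and `pumpkins_train`
# (the training split) exist as defined in lesson_4.Rmd.
library(tidymodels)

# fit() estimates the recipe and trains the model in one call, so there is
# no need to run prep() and bake() manually before training.
wf_fit <- log_reg_wf %>%
  fit(data = pumpkins_train)

# Inspect the fitted workflow (illustrative object name)
wf_fit
```

From there, predictions on the held-out set would come from `predict(wf_fit, new_data = pumpkins_test)`.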