add alternative IC for ROC metrics
giuseppec committed Nov 28, 2024
1 parent 03ffcac commit bef9851
Showing 5 changed files with 85 additions and 6 deletions.
Binary file added exercises-pdf/ic_concepts_2.pdf
Binary file added exercises-pdf/ic_sol_concepts_2.pdf
8 changes: 2 additions & 6 deletions exercises/evaluation-tex/ex_rnw/ex_roc-metrics-allocation.Rnw
@@ -1,8 +1,7 @@

\begin{enumerate}
\item \textbf{Confusion Matrix Completion and Explanation}
\begin{enumerate}
\item Below is an empty confusion matrix for a binary classification problem. Assign the correct terms (\( \text{TP} \), \( \text{FP} \), \( \text{FN} \), \( \text{TN} \)) to fields (A), (B), (C), (D) and explain what each term represents. % and how it is used to evaluate model performance.
\item
Below is an empty confusion matrix for a binary classification problem. Assign the correct terms (\( \text{TP} \), \( \text{FP} \), \( \text{FN} \), \( \text{TN} \)) to fields (A), (B), (C), (D) and explain what each term represents. % and how it is used to evaluate model performance.

\[
\begin{array}{c|c|c|}
@@ -14,9 +13,6 @@
\hline
\end{array}
\]


\end{enumerate}

\item Below is a table that incorrectly assigns 11 metrics derived from the confusion matrix to their corresponding formulas and descriptions.
The table contains the following columns:
65 changes: 65 additions & 0 deletions exercises/evaluation-tex/ex_rnw/sol_roc-metrics-allocation.Rnw
@@ -0,0 +1,65 @@
\begin{enumerate}
\item \textbf{Confusion Matrix Completion and Explanation}

\[
\begin{array}{c|c|c|}
\multicolumn{1}{c}{} & \textbf{Actual Positive} & \textbf{Actual Negative} \\
\hline
\textbf{Predicted Positive} & \text{TP (True Positive)} & \text{FP (False Positive)} \\
\hline
\textbf{Predicted Negative} & \text{FN (False Negative)} & \text{TN (True Negative)} \\
\hline
\end{array}
\]

\begin{itemize}
\item TP: Cases where the model correctly predicts the positive class.
\item FP: Cases where the model incorrectly predicts the positive class for a negative actual value.
\item FN: Cases where the model incorrectly predicts the negative class for a positive actual value.
\item TN: Cases where the model correctly predicts the negative class.
\end{itemize}
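
As an illustration (not part of the original solution file), a minimal R sketch showing how these four counts could be obtained from vectors of actual and predicted labels; the vectors below are made-up toy data:

<<confusion-matrix-sketch, eval = FALSE>>=
# Toy labels; "pos" is the positive class (hypothetical data)
actual    <- factor(c("pos", "pos", "neg", "neg", "pos", "neg"),
                    levels = c("pos", "neg"))
predicted <- factor(c("pos", "neg", "neg", "pos", "pos", "neg"),
                    levels = c("pos", "neg"))

TP <- sum(predicted == "pos" & actual == "pos")  # correctly predicted positives
FP <- sum(predicted == "pos" & actual == "neg")  # negatives predicted as positive
FN <- sum(predicted == "neg" & actual == "pos")  # positives predicted as negative
TN <- sum(predicted == "neg" & actual == "neg")  # correctly predicted negatives

# The same counts arranged as the confusion matrix above
table(Predicted = predicted, Actual = actual)
@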

\item
Below is a corrected table with the properly assigned metric names, formulas, and descriptions:

\begin{tabular}{|p{6cm}|l|p{6cm}|}
\hline
\textbf{Metric Name} & \textbf{Formula} & \textbf{Description} \\ \hline
A) True Positive Rate (TPR) \newline B) Recall \newline C) Sensitivity & b) \( \frac{\text{TP}}{\text{TP} + \text{FN}} \) & 1) Proportion of actual positives correctly identified. \\ \hline
D) True Negative Rate (TNR) \newline E) Specificity & c) \( \frac{\text{TN}}{\text{TN} + \text{FP}} \) & 2) Proportion of actual negatives correctly identified. \\ \hline
F) Precision \newline G) Positive Predictive Value (PPV) & d) \( \frac{\text{TP}}{\text{TP} + \text{FP}} \) & 3) Proportion of positive predictions that are correct. \\ \hline
H) False Discovery Rate (FDR) & e) \( \frac{\text{FP}}{\text{TP} + \text{FP}} \) & 4) Proportion of positive predictions that are incorrect. \\ \hline
I) False Positive Rate (FPR) & f) \( \frac{\text{FP}}{\text{FP} + \text{TN}} \) & 5) Proportion of actual negatives incorrectly classified as positive. \\ \hline
J) False Negative Rate (FNR) & a) \( \frac{\text{FN}}{\text{TP} + \text{FN}} \) & 11) Proportion of actual positives incorrectly classified as negative. \\ \hline
K) Negative Predictive Value (NPV) & k) \( \frac{\text{TN}}{\text{TN} + \text{FN}} \) & 9) Proportion of negative predictions that are correct. \\ \hline
L) Accuracy & g) \( \frac{\text{TP} + \text{TN}}{\text{TP} + \text{FP} + \text{FN} + \text{TN}} \) & 6) Overall proportion of correct predictions. \\ \hline
M) False Omission Rate (FOR) & l) \( \frac{\text{FN}}{\text{FN} + \text{TN}} \) & 10) Proportion of negative predictions that are incorrect. \\ \hline
N) Prevalence & j) \( \frac{\text{TP} + \text{FN}}{\text{TP} + \text{FP} + \text{FN} + \text{TN}} \) & 8) Proportion of actual positives in the dataset. \\ \hline
O) F1 Score & h) \( \frac{2 \cdot \text{Precision} \cdot \text{Recall}}{\text{Precision} + \text{Recall}} \) & 7) Combines precision and recall using their harmonic mean. \\ \hline
\end{tabular}

Identified Synonyms:
\begin{itemize}
\item True Positive Rate (TPR), Recall, and Sensitivity refer to the same metric.
\item True Negative Rate (TNR) and Specificity refer to the same metric.
\item Precision and Positive Predictive Value (PPV) refer to the same metric.
\end{itemize}

Unassignable Formulas:
\begin{itemize}
\item i) \( \frac{2 \cdot (\text{Precision} + \text{Recall})}{\text{Precision} \cdot \text{Recall}} \)
\item o) \( \frac{\text{TP} - \text{FP}}{\text{TP} + \text{FN} + \text{FP} + \text{TN}} \)
\item n) \( \frac{\text{FN} + \text{FP}}{\text{TP} + \text{TN} + \text{FP} + \text{FN}} \)
\item m) \( \frac{\text{FN}}{\text{FP} + \text{FN}} \)
\end{itemize}
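
As a numerical cross-check (not part of the original solution file), a short R sketch computing the assigned metrics from the four confusion-matrix counts; the counts used here are hypothetical:

<<roc-metrics-sketch, eval = FALSE>>=
# Hypothetical confusion-matrix counts
TP <- 40; FP <- 10; FN <- 5; TN <- 45
n  <- TP + FP + FN + TN

TPR  <- TP / (TP + FN)               # Recall / Sensitivity
TNR  <- TN / (TN + FP)               # Specificity
PPV  <- TP / (TP + FP)               # Precision
FDR  <- FP / (TP + FP)               # equals 1 - PPV
FPR  <- FP / (FP + TN)               # equals 1 - TNR
FNR  <- FN / (TP + FN)               # equals 1 - TPR
NPV  <- TN / (TN + FN)
ACC  <- (TP + TN) / n
FOR  <- FN / (FN + TN)               # equals 1 - NPV
PREV <- (TP + FN) / n
F1   <- 2 * PPV * TPR / (PPV + TPR)  # harmonic mean of precision and recall
@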
\end{enumerate}
18 changes: 18 additions & 0 deletions exercises/evaluation-tex/ic_sol_concepts_2.Rnw
Original file line number Diff line number Diff line change
@@ -0,0 +1,18 @@
% !Rnw weave = knitr

<<setup-child, include = FALSE>>=
library('knitr')
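# make this child document inherit the preamble of the shared style file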
knitr::set_parent("../../style/preamble_ueb.Rnw")
@


\kopfic{6}{Evaluation}

\aufgabe{}{
<<child="ex_rnw/ex_roc-metrics-allocation.Rnw">>=
@
}
\loesung{}{
<<child="ex_rnw/sol_roc-metrics-allocation.Rnw">>=
@
}
