Merge overleaf-2024-11-07-0946 into master
ludwigbothmann authored Nov 7, 2024
2 parents 089af83 + 2aa5f4a commit 5d2756e
Showing 1 changed file with 14 additions and 12 deletions.
@@ -45,24 +45,23 @@
\begin{vbframe}{Univariate example}
\begin{small}
\begin{itemize}
\item Given the heights (in cm) of 10 men and 10 women, we aim to classify a new person as male or female based on their height.
\item LDA models both classes using normal distributions with equal standard deviations (identical shapes).
\end{itemize}
\begin{center}
\includegraphics[width=0.85\textwidth, clip=true, trim={0 0 0 0}]{figure/disc_univariate-1.png}
\end{center}
\centerline{The optimal separation is located at the intersection (= decision boundary)!}
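As a quick sketch of why (writing $\mu_{man}, \mu_{woman}$ for the fitted class means and $\sigma^2$ for the shared variance): with equal priors, the boundary sits exactly where the two densities cross,
$$
p(x \mid y = \text{man}) = p(x \mid y = \text{woman}) \Leftrightarrow (x - \mu_{man})^2 = (x - \mu_{woman})^2 \Leftrightarrow x = \frac{\mu_{man} + \mu_{woman}}{2},
$$
i.e., at the midpoint of the two class means.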
\end{small}
\end{vbframe}

\begin{vbframe}{Univariate example: equal class sizes}
\begin{small}
Using our learned distributions, we can compute the posterior probability that a 172 cm tall person is male.
\begin{center}
\includegraphics[width=0.85\textwidth, clip=true, trim={0 0 0 0}]{figure/disc_univariate-2.png}
\end{center}
For equal class sizes, the prior probabilities $\pik$ cancel out (since $\pi_{man} = \pi_{woman}$):
$$
\P(y = \text{man} \mid \xv) = \frac{p(\xv \mid y = \text{man})}{p(\xv \mid y = \text{man}) + p(\xv \mid y = \text{woman})} = \frac{0.0135}{0.0135 + 0.088} = 0.133
$$
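More explicitly, this is Bayes' rule with the common factor $\pi_{man} = \pi_{woman}$ cancelled from numerator and denominator:
$$
\P(y = \text{man} \mid \xv) = \frac{\pi_{man} \, p(\xv \mid y = \text{man})}{\pi_{man} \, p(\xv \mid y = \text{man}) + \pi_{woman} \, p(\xv \mid y = \text{woman})}
$$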
@@ -71,14 +70,17 @@

\begin{vbframe}{Univariate example: unequal class sizes}
\begin{small}
For unequal class sizes (e.g., $\pi_{woman} = 2\pi_{man}$), the prior probabilities no longer cancel and the decision boundary shifts towards the smaller class.
\begin{center}
\includegraphics[width=0.86\textwidth, clip=true, trim={0 0 0 0}]{figure/disc_univariate-3.png}
\end{center}
\begin{align*}
\P(y = \text{man} \mid \xv) &= \frac{p(\xv \mid y = \text{man}) \pi_{man}}{p(\xv \mid y = \text{man}) \pi_{man} + p(\xv \mid y = \text{woman}) \pi_{woman}}\\
&= \frac{0.0135 \cdot \tfrac{1}{3}}{0.0135 \cdot \tfrac{1}{3} + 0.088 \cdot \tfrac{2}{3}} = 0.0712
\end{align*}
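A short sketch of the shift (same notation as above, with shared variance $\sigma^2$): the boundary now solves $\pi_{man} \, p(x \mid y = \text{man}) = \pi_{woman} \, p(x \mid y = \text{woman})$, which gives
$$
x = \frac{\mu_{man} + \mu_{woman}}{2} + \frac{\sigma^2}{\mu_{man} - \mu_{woman}} \log \frac{\pi_{woman}}{\pi_{man}},
$$
so for $\pi_{woman} > \pi_{man}$ the boundary moves from the midpoint towards the less frequent class (here the men, whose fitted mean is larger).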

\end{small}

\end{vbframe}
\begin{vbframe}{LDA decision boundaries}

@@ -146,7 +148,7 @@

\end{vbframe}

\begin{vbframe}{Univariate example with QDA}
\begin{small}
Different covariance matrices lead to multiple classification rules:
\begin{itemize}