diff --git a/AA2324/course/04_pca_svd_high_dim/04_pca_svd_high_dim.ipynb b/AA2324/course/04_pca_svd_high_dim/04_pca_svd_high_dim.ipynb
index f892d17..579cfbc 100644
--- a/AA2324/course/04_pca_svd_high_dim/04_pca_svd_high_dim.ipynb
+++ b/AA2324/course/04_pca_svd_high_dim/04_pca_svd_high_dim.ipynb
@@ -298,7 +298,7 @@
"source": [
"# Geometry of SVD \n",
"\n",
- "Source: wikipedia\n",
+ "Source: Wikipedia\n",
""
]
},
@@ -354,7 +354,21 @@
}
},
"source": [
- "### Find projection that maximizes the spread of the data\n",
+ "### Find a projection that maximizes the spread of the data\n",
"\n",
"- Find $\\mathbf{u}\\in \\mathbb{R}^k$ $\\left\\|\\mathbf{u}\\right\\|_2=1$ for which you can project the data $\\mathbf{x}^{\\prime}=\\mathbb{P}_{\\mathbf{u}}\\mathbf{x}$\n",
"\n",
@@ -480,12 +480,12 @@
}
},
"source": [
- "# What happens in high dimension?\n",
+ "# What happens in high dimensions?\n",
"\n",
"- Are we supposed to think about the world in a 3 dimensional space?\n",
"- Our brain can think in a 3 dimensional world.\n",
"- Maybe with relativity theory we arrive into a 4-D space if we consider time also.\n",
- "- So why we need higher dimensions?"
+ "- So why do we need higher dimensions?"
]
},
{
@@ -499,7 +499,7 @@
"# High Dimensional Space\n",
"\n",
"- Visualizing one hundred dimensional space is incredibly difficult for humans.\n",
- "- Most of time the representation for an input datum is a vector in **high dimensional space.**"
+ "- Most of the time the representation for an input datum is a vector in **high dimensional space.**"
]
},
{
@@ -631,10 +631,21 @@
"## Best Practice: Always look at the data before start working!\n",
"\n",
"- Do not do simple summary statistics on the dataset\n",
- "- Always look at a large random samples of the dataset\n",
+ "- Always look at a large random sample of the dataset\n",
"- The data may **contain noise** that you want to be aware of!\n",
"- Do not give for granted that the data **and** the labels are noise-free\n",
- "- You could try to plot the data to lower dimension also (i.e. with PCA)"
+ "- You could try to plot the data to a lower dimension also (i.e. with PCA)"
]
},
{
@@ -824,7 +824,7 @@
"source": [
"# Sampling\n",
"samples = np.random.uniform(0, 255, (100, 62, 47)).astype(np.uint8)\n",
- "# sampling unifirmly 100 points in 62x47 space.\n",
+ "# sampling uniformly 100 points in 62x47 space.\n",
"\n",
"# Plot the faces\n",
"N_ax, N_img = 10, 10 # 10 rows with 10 images per row\n",
@@ -862,7 +862,7 @@
"source": [
"# Sampling\n",
"samples = np.random.uniform(0, 255, (100, 62, 47)).astype(np.uint8)\n",
- "# sampling unifirmly 100 points in 62x47 space.\n",
+ "# sampling uniformly 100 points in 62x47 space.\n",
"\n",
"# Plot the faces\n",
"N_ax, N_img = 10, 10 # 10 rows with 10 images per row\n",
@@ -884,7 +884,7 @@
},
"source": [
"### The probability of hitting a face is very small!\n",
- "The phenomenon is also know as..."
+ "The phenomenon is also known as..."
]
},
{
@@ -1002,9 +1002,20 @@
"source": [
"### Distances in high dimensional space\n",
"\n",
- "Distances in high dimension follow a pattern:\n",
+ "Distances in high dimensions follow a pattern:\n",
"\n",
- "- Distance of points sampled on the unit sphere goes to zero as D increases.\n",
+ "- The distance of points sampled on the unit sphere goes to zero as D increases.\n",
"- Euclidean distance of points goes to $4\\sqrt{D}$.\n",
"- The variance of distances between two random points keeps shrinking as D increases.\n",
"\n",
@@ -1331,7 +1331,10 @@
}
],
"source": [
- "############### Fittin with 3 component ######\n",
+ "############### Fittin with 3 components ######\n",
"pca = PCA(n_components=3) # retain 3 components\n",
"pca.fit(faces.data)\n",
"#############################################\n",
@@ -1474,7 +1474,7 @@
"source": [
"# Why compression\n",
"\n",
- "This example is just used for illustrative purpose since **Machine Learning** is much related to **Information Theory**.\n",
+ "This example is just used for illustrative purposes since **Machine Learning** is much related to **Information Theory**.\n",
"\n",
""
]
@@ -1609,7 +1609,7 @@
}
},
"source": [
- "# Any question of previous lectures before moving on?\n",
+ "# Any questions of previous lectures before moving on?\n",
"\n",
"- We will review a few concept of PCA at the end of matrix calculus"
]
@@ -1624,7 +1624,7 @@
"source": [
"# Matrix Calculus\n",
"\n",
- "Part of this lectures are taken from:\n",
+ "Part of these lectures are taken from:\n",
"\n",
"- [CS229 LinAlg](http://cs229.stanford.edu/summer2019/cs229-linalg.pdf)\n",
"- [CS229 Calculus Recap](https://www.youtube.com/watch?v=b0HvwszmqcQ)\n",
@@ -1698,7 +1698,20 @@
"\n",
"\n",
"- Let's take a complex function $$f(x) = \\sin(x^x)$$ over the $[0, 3]$. \n",
- "- Its behaviour is not simple to understand."
+ "- Its behavior is not simple to understand."
]
},
{
@@ -2059,7 +2059,18 @@
"$$\\mbf{x}^T \\mbf{A}\\mbf{x}$$\n",
"\n",
"- It is vector to scalar function\n",
- "- It used for characterizing **Definiteness** of matrices"
+ "- It is used for characterizing **Definiteness** of matrices"
]
},
{