Merge pull request #90 from kk-Syuer/patch-2
update of some typos found
iacopomasi authored Mar 21, 2024
2 parents cce82e7 + 6cc625d commit cab835c
Showing 1 changed file with 18 additions and 18 deletions.
36 changes: 18 additions & 18 deletions AA2324/course/04_pca_svd_high_dim/04_pca_svd_high_dim.ipynb
@@ -298,7 +298,7 @@
"source": [
"# Geometry of SVD \n",
"\n",
- "Source: wikipedia\n",
+ "Source: Wikipedia\n",
"<img width='50%' src=\"https://upload.wikimedia.org/wikipedia/commons/b/bb/Singular-Value-Decomposition.svg\" />"
]
},
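The SVD figure referenced in the hunk above decomposes any matrix into rotate–scale–rotate. A minimal numpy sketch (illustrative, not part of the commit) showing the factorization and reconstruction:

```python
import numpy as np

# SVD factors a real matrix A into a rotation (Vt), an axis-aligned
# scaling (diag(S)), and another rotation (U) -- the geometry in the figure.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
U, S, Vt = np.linalg.svd(A)

# Reconstruct A from its factors: U @ diag(S) @ Vt
A_rec = U @ np.diag(S) @ Vt
print(np.allclose(A, A_rec))  # True
```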
@@ -354,7 +354,7 @@
}
},
"source": [
- "### Find projection that maximizes the spread of the data\n",
+ "### Find a projection that maximizes the spread of the data\n",
"\n",
"- Find $\\mathbf{u}\\in \\mathbb{R}^k$ $\\left\\|\\mathbf{u}\\right\\|_2=1$ for which you can project the data $\\mathbf{x}^{\\prime}=\\mathbb{P}_{\\mathbf{u}}\\mathbf{x}$\n",
"\n",
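The projection this cell describes, a unit vector $\mathbf{u}$ that maximizes the spread of the projected data $\mathbf{x}' = \mathbb{P}_{\mathbf{u}}\mathbf{x}$, is the top right-singular vector of the centered data. A small numpy sketch (illustrative, not from the notebook):

```python
import numpy as np

rng = np.random.default_rng(0)
# Anisotropic 2-D data: most of the spread lies along the first axis
X = rng.normal(size=(500, 2)) * np.array([5.0, 1.0])
X = X - X.mean(axis=0)

# The top right-singular vector of the centered data maximizes the
# variance of the projection u^T x over all unit vectors u
_, _, Vt = np.linalg.svd(X, full_matrices=False)
u = Vt[0]

var_u = np.var(X @ u)
# Variance along an arbitrary unit direction, for comparison
var_rand = np.var(X @ np.array([np.cos(1.0), np.sin(1.0)]))
print(var_u >= var_rand)  # True: the principal direction wins
```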
@@ -480,12 +480,12 @@
}
},
"source": [
- "# What happens in high dimension?\n",
+ "# What happens in high dimensions?\n",
"\n",
"- Are we supposed to think about the world in a 3 dimensional space?\n",
"- Our brain can think in a 3 dimensional world.\n",
"- Maybe with relativity theory we arrive into a 4-D space if we consider time also.\n",
- "- So why we need higher dimensions?"
+ "- So why do we need higher dimensions?"
]
},
{
@@ -499,7 +499,7 @@
"# High Dimensional Space\n",
"\n",
"- Visualizing one hundred dimensional space is incredibly difficult for humans.\n",
- "- Most of time the representation for an input datum is a vector in **high dimensional space.**"
+ "- Most of the time the representation for an input datum is a vector in **high dimensional space.**"
]
},
{
@@ -631,10 +631,10 @@
"## Best Practice: Always look at the data before start working!\n",
"\n",
"- Do not do simple summary statistics on the dataset\n",
- "- Always look at a large random samples of the dataset\n",
+ "- Always look at a large random sample of the dataset\n",
"- The data may **contain noise** that you want to be aware of!\n",
"- Do not give for granted that the data **and** the labels are noise-free\n",
- "- You could try to plot the data to lower dimension also (i.e. with PCA)"
+ "- You could try to plot the data to a lower dimension also (i.e. with PCA)"
]
},
{
@@ -824,7 +824,7 @@
"source": [
"# Sampling\n",
"samples = np.random.uniform(0, 255, (100, 62, 47)).astype(np.uint8)\n",
- "# sampling unifirmly 100 points in 62x47 space.\n",
+ "# sampling uniformly 100 points in 62x47 space.\n",
"\n",
"# Plot the faces\n",
"N_ax, N_img = 10, 10 # 10 rows with 10 images per row\n",
@@ -862,7 +862,7 @@
"source": [
"# Sampling\n",
"samples = np.random.uniform(0, 255, (100, 62, 47)).astype(np.uint8)\n",
- "# sampling unifirmly 100 points in 62x47 space.\n",
+ "# sampling uniformly 100 points in 62x47 space.\n",
"\n",
"# Plot the faces\n",
"N_ax, N_img = 10, 10 # 10 rows with 10 images per row\n",
@@ -884,7 +884,7 @@
},
"source": [
"### The probability of hitting a face is very small!\n",
- "The phenomenon is also know as..."
+ "The phenomenon is also known as..."
]
},
{
@@ -1002,9 +1002,9 @@
"source": [
"### Distances in high dimensional space\n",
"\n",
- "Distances in high dimension follow a pattern:\n",
+ "Distances in high dimensions follow a pattern:\n",
"\n",
- "- Distance of points sampled on the unit sphere goes to zero as D increases.\n",
+ "- The distance of points sampled on the unit sphere goes to zero as D increases.\n",
"- Euclidean distance of points goes to $4\\sqrt{D}$.\n",
"- The variance of distances between two random points keeps shrinking as D increases.\n",
"\n",
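The distance facts listed in the hunk above can be spot-checked numerically. The sketch below (illustrative, not part of the commit) shows the relative spread of distances between random points shrinking as D grows:

```python
import numpy as np

rng = np.random.default_rng(0)

def rel_spread(D, n=500):
    # Distances between n random point pairs drawn uniformly in [0, 1]^D
    a = rng.random((n, D))
    b = rng.random((n, D))
    d = np.linalg.norm(a - b, axis=1)
    return d.std() / d.mean()   # relative spread of the distances

# The relative spread shrinks as D grows: distances "concentrate"
print(rel_spread(2), rel_spread(100), rel_spread(5000))
```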
@@ -1331,7 +1331,7 @@
}
],
"source": [
- "############### Fittin with 3 component ######\n",
+ "############### Fittin with 3 components ######\n",
"pca = PCA(n_components=3) # retain 3 components\n",
"pca.fit(faces.data)\n",
"#############################################\n",
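The cell in this hunk fits a 3-component PCA with scikit-learn on `faces.data`; the same operation can be sketched with plain numpy SVD (random data stands in for the faces, so this is illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 62 * 47))   # stand-in for faces.data

# PCA via SVD: center the data, factor it, keep the top k
# right-singular vectors (like pca.components_)
k = 3
mu = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - mu, full_matrices=False)
components = Vt[:k]

Z = (X - mu) @ components.T           # project to 3 dimensions
X_rec = Z @ components + mu           # reconstruct from 3 components
print(Z.shape, X_rec.shape)
```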
@@ -1474,7 +1474,7 @@
"source": [
"# Why compression\n",
"\n",
- "This example is just used for illustrative purpose since **Machine Learning** is much related to **Information Theory**.\n",
+ "This example is just used for illustrative purposes since **Machine Learning** is much related to **Information Theory**.\n",
"\n",
"<img width=\"50%\" src=\"figs/compression_net.png\" />"
]
@@ -1609,7 +1609,7 @@
}
},
"source": [
- "# Any question of previous lectures before moving on?\n",
+ "# Any questions of previous lectures before moving on?\n",
"\n",
"- We will review a few concept of PCA at the end of matrix calculus"
]
@@ -1624,7 +1624,7 @@
"source": [
"# Matrix Calculus\n",
"\n",
- "Part of this lectures are taken from:\n",
+ "Part of these lectures are taken from:\n",
"\n",
"- [CS229 LinAlg](http://cs229.stanford.edu/summer2019/cs229-linalg.pdf)\n",
"- [CS229 Calculus Recap](https://www.youtube.com/watch?v=b0HvwszmqcQ)\n",
@@ -1698,7 +1698,7 @@
"\n",
"\n",
"- Let's take a complex function $$f(x) = \\sin(x^x)$$ over the $[0, 3]$. \n",
- "- Its behaviour is not simple to understand."
+ "- Its behavior is not simple to understand."
]
},
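The cell in this hunk introduces $f(x) = \sin(x^x)$ on $[0, 3]$. By the chain rule, since $x^x = e^{x \ln x}$, its derivative is $f'(x) = \cos(x^x)\, x^x (\ln x + 1)$, which a finite difference confirms (sketch, not part of the commit):

```python
import numpy as np

# f(x) = sin(x^x); by the chain rule, with x^x = exp(x ln x):
#   f'(x) = cos(x^x) * x^x * (ln x + 1)
f = lambda x: np.sin(x ** x)
df = lambda x: np.cos(x ** x) * x ** x * (np.log(x) + 1.0)

# Compare against a central finite difference at a point in (0, 3)
x, h = 1.7, 1e-6
num = (f(x + h) - f(x - h)) / (2 * h)
print(np.isclose(df(x), num, atol=1e-4))  # True
```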
{
Expand Down Expand Up @@ -2059,7 +2059,7 @@
"$$\\mbf{x}^T \\mbf{A}\\mbf{x}$$\n",
"\n",
"- It is vector to scalar function\n",
- "- It used for characterizing **Definiteness** of matrices"
+ "- It is used for characterizing **Definiteness** of matrices"
]
},
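The quadratic form $\mathbf{x}^T \mathbf{A} \mathbf{x}$ in this hunk can be illustrated numerically: for a symmetric $\mathbf{A}$, it is positive for every nonzero $\mathbf{x}$ exactly when all eigenvalues of $\mathbf{A}$ are positive (sketch, not part of the commit):

```python
import numpy as np

rng = np.random.default_rng(0)

# A symmetric A is positive definite iff x^T A x > 0 for all x != 0,
# equivalently iff all its eigenvalues are positive.
A = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])   # eigenvalues 1 and 3 -> positive definite

# Spot-check the quadratic form on random nonzero vectors
xs = rng.normal(size=(1000, 2))
q = np.einsum('ni,ij,nj->n', xs, A, xs)
print((q > 0).all())                       # True
print(np.all(np.linalg.eigvalsh(A) > 0))   # True
```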
{
