[zh] cs-229-unsupervised-learning #48
Conversation
Complete all sentences.
Awesome work @tigerneil!

OK 👌
32. **Eigenvalue, eigenvector ― Given a matrix A ∈ R^(n×n), λ is said to be an eigenvalue of A if there exists a vector z ∈ R^n∖{0}, called eigenvector, such that we have:**

⟶ 特征值,特征向量 - 给定矩阵 A ∈ R^(n×n),λ 被称为 A 的一个特征值,当存在一个称为特征向量的向量 z ∈ R^n∖{0},使得:
给定矩阵A∈Rn×n,如果存在称为本征向量的向量z∈Rn∖{0},则称λ为A的特征值,这样我们就得到:
Keeping the original wording here is more consistent.
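Side note for reviewers: a minimal NumPy sketch (not part of the cheatsheet, matrix values made up for illustration) checking the definition above, i.e. that each eigenpair satisfies Az = λz:

```python
import numpy as np

# Illustrative check: lambda is an eigenvalue of A if A z = lambda z
# holds for some nonzero vector z (the eigenvector).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)
for lam, z in zip(eigenvalues, eigenvectors.T):
    # Each pair (lam, z) satisfies A @ z == lam * z up to float error.
    assert np.allclose(A @ z, lam * z)
```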
36. **Algorithm ― The Principal Component Analysis (PCA) procedure is a dimension reduction technique that projects the data on k dimensions by maximizing the variance of the data as follows:**

⟶ 算法 - 主成分分析(PCA)过程就是一个降维技巧,通过最大化数据的方差而将数据投影到 k 维上:
将数据投影到 k 维上 -> 来投影k维数据
The original wording carries the correct meaning.
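Side note for reviewers: a minimal NumPy sketch (not part of the cheatsheet, with made-up data) of the PCA procedure described above, projecting the data onto the k directions of maximal variance:

```python
import numpy as np

def pca_project(X, k):
    """Project the rows of X onto the k directions of maximal variance."""
    Xc = X - X.mean(axis=0)                      # center the data
    cov = (Xc.T @ Xc) / X.shape[0]               # empirical covariance
    vals, vecs = np.linalg.eigh(cov)             # eigh: cov is symmetric
    top_k = vecs[:, np.argsort(vals)[::-1][:k]]  # top-k eigenvectors
    return Xc @ top_k                            # projected data

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = pca_project(X, 2)
assert Z.shape == (100, 2)
```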
46. **The goal is to find the unmixing matrix W = A^(−1).**

⟶ 目标是要找到去混合矩阵 W = A^(−1)。
去混合矩阵 -> 解混矩阵
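Side note for reviewers: a minimal NumPy sketch (not part of the cheatsheet; the mixing matrix is made up and taken as known, whereas ICA must estimate it) showing what the unmixing matrix W = A^(−1) does:

```python
import numpy as np

# Independent sources s are mixed by A: x = A s.
# ICA seeks the unmixing matrix W = A^(-1) so that W x recovers s.
rng = np.random.default_rng(0)
s = rng.uniform(-1, 1, size=(2, 1000))   # independent sources
A = np.array([[1.0, 0.5],
              [0.3, 1.0]])               # mixing matrix (assumed known here)
x = A @ s                                # observed mixed signals
W = np.linalg.inv(A)                     # the ideal unmixing matrix
assert np.allclose(W @ x, s)             # sources are recovered exactly
```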
35. **Remark: the eigenvector associated with the largest eigenvalue is called principal eigenvector of matrix A.**

⟶ 注:关联于最大的特征值的特征向量被称为矩阵 A 的主特征向量。
注:最大的特征值对应的特征向量被称为矩阵 A 的主特征向量。
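Side note for reviewers: a minimal NumPy sketch (not part of the cheatsheet, matrix values made up) of the remark above, picking out the eigenvector of the largest eigenvalue:

```python
import numpy as np

def principal_eigenvector(A):
    """Return the eigenvector associated with the largest eigenvalue of A."""
    vals, vecs = np.linalg.eigh(A)   # A assumed symmetric, as in PCA
    return vecs[:, np.argmax(vals)]

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
v = principal_eigenvector(A)
lam = np.max(np.linalg.eigvalsh(A))
assert np.allclose(A @ v, lam * v)   # v is an eigenvector for the top eigenvalue
```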
38. **Step 2: Compute Σ = (1/m) ∑_{i=1}^m x^(i) (x^(i))^T ∈ R^(n×n), which is symmetric with real eigenvalues.**

⟶ 步骤 2:计算 Σ = (1/m) ∑_{i=1}^m x^(i) (x^(i))^T ∈ R^(n×n),其为有实特征值的对称阵。
步骤 2:计算 Σ = (1/m) ∑_{i=1}^m x^(i) (x^(i))^T ∈ R^(n×n),它是对称阵,特征值是实数。
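Side note for reviewers: a minimal NumPy sketch (not part of the cheatsheet, with made-up data) verifying the properties stated in step 2, that Σ is symmetric with real eigenvalues:

```python
import numpy as np

# Step 2 of PCA: Sigma = (1/m) * sum_i x^(i) (x^(i))^T, an n x n matrix.
# With the m examples stored as rows of X, this is (X^T X) / m.
rng = np.random.default_rng(0)
m, n = 200, 3
X = rng.normal(size=(m, n))
Sigma = (X.T @ X) / m                        # equals (1/m) sum_i x_i x_i^T
assert np.allclose(Sigma, Sigma.T)           # symmetric
assert np.all(np.isreal(np.linalg.eigvalsh(Sigma)))  # real eigenvalues
```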