
[zh] cs-229-unsupervised-learning #48

Closed
wants to merge 3 commits

Conversation

tigerneil
Contributor

No description provided.

@tigerneil tigerneil changed the title Update cheatsheet-unsupervised-learning.md [zh] unsupervised learning Sep 26, 2018
complete all sentences.
@tigerneil tigerneil changed the title [zh] unsupervised learning [zh] Unsupervised Learning Sep 26, 2018
@tigerneil tigerneil changed the title [zh] Unsupervised Learning [zh] Unsupervised learning Sep 26, 2018
@shervinea shervinea added the reviewer wanted Looking for a reviewer label Sep 27, 2018
@shervinea
Owner

Awesome work @tigerneil!
Now, let's wait for another native speaker to review the translation.

@tigerneil
Contributor Author

OK👌


<br>

32. **Eigenvalue, eigenvector ― Given a matrix A∈R^{n×n}, λ is said to be an eigenvalue of A if there exists a vector z∈R^{n}∖{0}, called an eigenvector, such that we have:**

&#10230; 特征值,特征向量 - 给定矩阵 A∈R^{n×n},λ 被称为 A 的一个特征值当存在一个称为特征向量的向量 z∈R^{n}∖{0},使得:
Contributor


给定矩阵 A∈R^{n×n},如果存在称为本征向量的向量 z∈R^{n}∖{0},则称 λ 为 A 的特征值,这样我们就得到:

Contributor Author


Better to keep it as is here, for consistency.
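As a side note, the defining relation in item 32 can be checked numerically. A minimal NumPy sketch (the matrix values are purely illustrative):

```python
import numpy as np

# Illustrative matrix; any square A in R^{n x n} works.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors as columns.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Check the defining relation A z = lambda z for every (lambda, z) pair.
for lam, z in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ z, lam * z)

print(sorted(eigenvalues.real))  # for this diagonal A: [2.0, 3.0]
```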


<br>

36. **Algorithm ― The Principal Component Analysis (PCA) procedure is a dimension reduction technique that projects the data on k dimensions by maximizing the variance of the data as follows:**

&#10230; 算法 - 主成分分析(PCA)过程就是一个降维技巧,通过最大化数据的方差而将数据投影到 k 维上:
Contributor


将数据投影到 k 维上 -> 来投影k维数据

Contributor Author


Keeping it as is conveys the correct meaning.
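For reference, the PCA procedure named in item 36 can be sketched in NumPy. `pca_project` is a hypothetical helper name; the steps (center the data, form the empirical covariance, take the top-k eigenvectors, project) follow the description above:

```python
import numpy as np

def pca_project(X, k):
    """Project X onto its k principal components (illustrative sketch)."""
    X_centered = X - X.mean(axis=0)                    # center the data
    Sigma = X_centered.T @ X_centered / X.shape[0]     # empirical covariance
    eigvals, eigvecs = np.linalg.eigh(Sigma)           # symmetric => eigh
    top_k = eigvecs[:, np.argsort(eigvals)[::-1][:k]]  # k principal directions
    return X_centered @ top_k                          # projection onto k dims

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = pca_project(X, 2)
print(Z.shape)  # (100, 2)
```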


<br>

46. **The goal is to find the unmixing matrix W=A^{−1}.**

&#10230; 目标是要找到去混合矩阵 W=A^{−1}。
Contributor


去混合矩阵 -> 解混矩阵
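For context, a tiny NumPy sketch of what the unmixing matrix in item 46 does (the matrix and signal values are made up): if the observations are x = As, then W = A^{−1} recovers the sources as s = Wx:

```python
import numpy as np

A = np.array([[1.0, 0.5],
              [0.2, 1.0]])   # hypothetical mixing matrix (invertible)
W = np.linalg.inv(A)         # the unmixing matrix W = A^{-1}

s = np.array([0.3, -1.2])    # hypothetical source signals
x = A @ s                    # mixed observations x = A s
recovered = W @ x            # unmixing recovers the sources

assert np.allclose(recovered, s)
```

(In practice ICA must estimate W from the observations alone; this only illustrates the role W plays.)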


<br>

35. **Remark: the eigenvector associated with the largest eigenvalue is called principal eigenvector of matrix A.**

&#10230; 注:关联于最大的特征值的特征向量被称为矩阵 A 的主特征向量。


注:最大的特征值对应的特征向量被称为矩阵 A 的主特征向量。
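A small NumPy sketch of the remark in item 35 (the symmetric matrix is illustrative): the principal eigenvector is the one paired with the largest eigenvalue:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])                  # illustrative symmetric matrix
eigvals, eigvecs = np.linalg.eigh(A)        # eigh: for symmetric matrices
principal = eigvecs[:, np.argmax(eigvals)]  # column of the largest eigenvalue

# The principal eigenvector satisfies A v = lambda_max v.
assert np.allclose(A @ principal, eigvals.max() * principal)
```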


<br>

38. **Step 2: Compute Σ=(1/m)∑_{i=1}^{m}x^{(i)}x^{(i)T}∈R^{n×n}, which is symmetric with real eigenvalues.**

&#10230; 步骤 2:计算 Σ=(1/m)∑_{i=1}^{m}x^{(i)}x^{(i)T}∈R^{n×n},其为有实特征值的对称阵。


步骤 2:计算 Σ=(1/m)∑_{i=1}^{m}x^{(i)}x^{(i)T}∈R^{n×n},它是对称阵,特征值是实数。
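The symmetry and real-eigenvalue claim in step 2 can be verified numerically; a sketch with random data (shapes and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 50, 3
X = rng.normal(size=(m, n))       # rows are the examples x^{(i)}

# Sigma = (1/m) * sum_i x^{(i)} x^{(i)T}, written as a matrix product.
Sigma = X.T @ X / m

assert np.allclose(Sigma, Sigma.T)      # Sigma is symmetric
eigvals = np.linalg.eigvalsh(Sigma)     # eigvalsh: real eigenvalues of a
assert np.all(np.isreal(eigvals))       # symmetric matrix
```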

@shervinea shervinea closed this in 1811d08 Feb 11, 2019
@shervinea shervinea changed the title [zh] Unsupervised learning [zh] cs-229-unsupervised-learning Oct 6, 2020