Corrected some symbols #47

Open · wants to merge 2 commits into master
4 changes: 2 additions & 2 deletions 计算机视觉/21_手动推导反向传播公式BP.md
@@ -27,7 +27,7 @@ $$

First, from the definition of the neural network's error function, we can easily derive the delta error of the output layer, $$\delta^L$$:
$$
- \delta^L = \frac{\partial C}{\partial z^L} = \frac{\partial C}{\partial a^L} \frac{\partial a^L}{\partial z^L} = (a^L-y) \odot \delta'(z^L)
+ \delta^L = \frac{\partial C}{\partial z^L} = \frac{\partial C}{\partial a^L} \frac{\partial a^L}{\partial z^L} = (a^L-y) \odot \sigma'(z^L)
$$
In this formula, $$\odot$$ denotes the Hadamard product, i.e. element-wise multiplication. Note that the output layer's delta error $$\delta^L$$ depends on the definition of the loss function: different loss functions yield different results. This article uses the mean squared error as its running example.
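The corrected formula can be checked numerically. Below is a minimal sketch (not part of the original article) assuming a sigmoid activation and the MSE cost $$C = \frac{1}{2}\lVert a^L - y \rVert^2$$; it compares the analytic delta $$(a^L-y) \odot \sigma'(z^L)$$ against a central-difference gradient of the cost with respect to $$z^L$$:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

rng = np.random.default_rng(0)
z_L = rng.normal(size=4)   # pre-activations of the output layer (hypothetical values)
y = rng.normal(size=4)     # targets
a_L = sigmoid(z_L)         # output activations

# Analytic output-layer delta: (a^L - y) ⊙ sigma'(z^L), with ⊙ as element-wise multiply
delta_L = (a_L - y) * sigmoid_prime(z_L)

# Numerical check: dC/dz^L for C = 0.5 * ||sigmoid(z) - y||^2 via central differences
def cost(z):
    return 0.5 * np.sum((sigmoid(z) - y) ** 2)

eps = 1e-6
num = np.array([(cost(z_L + eps * e) - cost(z_L - eps * e)) / (2 * eps)
                for e in np.eye(4)])
assert np.allclose(delta_L, num, atol=1e-8)
```

The assertion passes only with $$\sigma'(z^L)$$, which is exactly the symbol fix this pull request makes.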

@@ -44,7 +44,7 @@ $$
$$
Also:
$$
- z^{l+1} = W^{l+!}a^l+b^{l+!} = W^{l+!}\sigma(z^l)+b^{l+!}
+ z^{l+1} = W^{l+1}a^l+b^{l+1} = W^{l+1}\sigma(z^l)+b^{l+1}
$$
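This forward relation, together with the chain rule, lets the delta propagate backward layer by layer. A minimal sketch (assumptions: sigmoid activations, hypothetical layer sizes 4 and 3) that numerically verifies the resulting step $$\delta^l = \left((W^{l+1})^T \delta^{l+1}\right) \odot \sigma'(z^l)$$:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

rng = np.random.default_rng(1)
W2 = rng.normal(size=(3, 4))   # W^{l+1} (hypothetical weights)
b2 = rng.normal(size=3)        # b^{l+1}
z1 = rng.normal(size=4)        # z^l
y = rng.normal(size=3)         # targets for the MSE cost

# Cost as a function of z^l, using the forward relation z^{l+1} = W^{l+1} sigma(z^l) + b^{l+1}
def cost_from_z1(z):
    z2 = W2 @ sigmoid(z) + b2
    return 0.5 * np.sum((sigmoid(z2) - y) ** 2)

z2 = W2 @ sigmoid(z1) + b2
delta2 = (sigmoid(z2) - y) * sigmoid_prime(z2)   # output-layer delta (BP1)
delta1 = (W2.T @ delta2) * sigmoid_prime(z1)     # back-propagated delta (BP2)

# Numerical check: dC/dz^l by central differences
eps = 1e-6
num = np.array([(cost_from_z1(z1 + eps * e) - cost_from_z1(z1 - eps * e)) / (2 * eps)
                for e in np.eye(4)])
assert np.allclose(delta1, num, atol=1e-8)
```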
Therefore:
$$