
Your loss function is written incorrectly #4

Open
tedljw opened this issue Oct 16, 2019 · 0 comments


tedljw commented Oct 16, 2019

The official version is:
crossEntropy = -torch.log(torch.gather(inp, 1, target.view(-1, 1)).squeeze(1))
You left out the .squeeze(1). Without it, the elements that should have been masked out are also pulled into the computation, so the mask has no effect.
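To illustrate the bug: without .squeeze(1), crossEntropy has shape (batch, 1), and masked_select then broadcasts the 1-D mask against it, selecting entries that should have been masked away. A minimal sketch of the corrected masked-NLL loss (the function name, argument order, and shapes here are assumptions based on the official PyTorch chatbot tutorial, not necessarily this repo's exact code):

```python
import torch

def maskNLLLoss(inp, target, mask):
    # inp:    (batch, vocab) softmax probabilities
    # target: (batch,) gold token indices
    # mask:   (batch,) bool mask, True where the position counts
    nTotal = mask.sum()
    # gather picks the probability of each target token, giving shape (batch, 1);
    # squeeze(1) collapses it to (batch,) so it lines up with the 1-D mask.
    crossEntropy = -torch.log(
        torch.gather(inp, 1, target.view(-1, 1)).squeeze(1)
    )
    # With matching shapes, masked_select keeps exactly the unmasked losses.
    # If squeeze(1) is dropped, crossEntropy stays (batch, 1) and masked_select
    # broadcasts mask to (batch, vocab-like) shape, leaking masked entries in.
    loss = crossEntropy.masked_select(mask).mean()
    return loss, nTotal.item()
```

For example, with a batch of 2 where only the first position is unmasked, the loss should equal -log of the first target's probability alone; without the squeeze, the masked second position's loss would also be averaged in.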
