
Question about get_L2norm_loss_self_drive #13

Open
kevinbro96 opened this issue Aug 18, 2020 · 2 comments

Comments

@kevinbro96

Hi there. I wonder why we should L2-normalize the feature 'x' first.
After L2 normalization, the norm of feature 'x' is always 1, so the true norm of feature 'x' is lost.

@jihanyang
Owner

#4

@kevinbro96
Author

Hi, thanks for your reply, but #4 is not really related to my question. What I want to ask is: why is the feature x L2-normalized first before computing the loss, as in x.norm(p=2, dim=1)? Doesn't that discard the true norm of the feature x? Besides, even without L2-normalizing x first, you can still achieve the effect you want, namely that the loss depends only on \Delta r.
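For readers without the repository at hand, here is a minimal NumPy sketch of the kind of self-driven L2-norm loss being discussed. This is an illustrative assumption reconstructed from the thread, not the repository's actual code; the `delta_r` parameter and the exact form are hypothetical. Note that `x.norm(p=2, dim=1)` in PyTorch computes the per-sample norms of x; it does not normalize x.

```python
import numpy as np

def get_L2norm_loss_self_drive(x, delta_r=1.0):
    # Per-sample L2 norms of the feature batch x (shape: [batch, dim]).
    norms = np.linalg.norm(x, ord=2, axis=1)
    # Target radius: the current norm, treated as a constant (a stand-in
    # for .detach() in PyTorch), enlarged by the step delta_r.
    target = norms.copy() + delta_r
    # Squared distance between each norm and its enlarged target.
    return float(np.mean((norms - target) ** 2))

x = np.array([[3.0, 4.0], [0.0, 5.0]])   # per-sample norms: 5.0 and 5.0
loss = get_L2norm_loss_self_drive(x, delta_r=1.0)
# Forward value is always delta_r ** 2 here, i.e. 1.0.
```

This illustrates the point raised in the question: because the target is detached, the forward value of the loss is always `delta_r ** 2`. In PyTorch, however, the gradient still flows through the non-detached norms term, so each feature's norm is pushed up by `delta_r` at every step.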
