Question about L2 normalization of input #9
Comments
I wonder about that too. Since it is a DAE and I don't see any noise being added to the input, is the L2 normalization itself the noise?
The noise is added by using dropout.
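To make the point above concrete, here is a minimal numpy sketch (hypothetical, not the repo's actual TensorFlow code) of the denoising step: dropout randomly zeroes entries of the click vector, and that corrupted vector is what the encoder sees. The `corrupt` helper and its `keep_prob` parameter are my own names, not from the repo.

```python
import numpy as np

rng = np.random.default_rng(0)

def corrupt(x, keep_prob=0.5):
    """Inverted dropout: zero each entry w.p. 1 - keep_prob, rescale survivors."""
    mask = rng.random(x.shape) < keep_prob
    return np.where(mask, x / keep_prob, 0.0)

x = np.array([1.0, 0.0, 1.0, 1.0])   # binary click vector for one user
x_noisy = corrupt(x, keep_prob=0.5)  # some clicks dropped, survivors scaled by 1/keep_prob
```

Because the mask is Bernoulli, this is exactly the "Bernoulli noise on the input" mentioned later in the thread; the 1/keep_prob rescaling keeps the expected input magnitude unchanged.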
Can you explain the L2 normalization in other words, please? I didn't understand.
Here's what I think. I also have an unrelated question: when using the Gaussian likelihood, the log-likelihood contains a confidence weight c_ij, but the multinomial likelihood does not. Can you explain why the multinomial doesn't need c_ij?
Your explanation makes sense to me. In the case where your inputs are binary, normalization does not help. Regarding the other issue, can you point to it in the code?
In the VAE-CF paper, the Gaussian log-likelihood (Eq. 3) contains a confidence weight c_ui, but the multinomial log-likelihood (Eq. 2) does not. Can you explain why the multinomial doesn't need a confidence weight? And I think that even in the case of binary inputs, the feedback can still be distorted.
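From memory of the VAE-CF paper (Liang et al., 2018), the two likelihoods being compared look roughly like this; treat this as a paraphrase rather than a quote of Eqs. 2 and 3:

```latex
% Multinomial log-likelihood (Eq. 2): \pi(z_u) is a softmax over items.
\log p_\theta(x_u \mid z_u) = \sum_i x_{ui} \log \pi_i(z_u)

% Gaussian log-likelihood (Eq. 3, up to a constant): without c_{ui},
% every entry -- clicked or not -- would be weighted equally.
\log p_\theta(x_u \mid z_u) = -\sum_i \frac{c_{ui}}{2} \bigl(x_{ui} - f_{ui}(z_u)\bigr)^2
```

One common explanation: the softmax couples all items, so putting probability mass on unclicked items necessarily takes mass away from clicked ones, which implicitly down-weights the zeros much like a confidence weight would. The Gaussian treats every entry independently, so it needs an explicit c_ui to up-weight the observed clicks.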
@jin530 In my opinion, both the DAE and the VAE use a denoising structure, that is, L2 normalization and dropout, where the dropout adds Bernoulli noise to the input data.
Hi, I have a question about the L2 normalization of the input.
In the q_graph method of Multi_VAE and the forward_pass method of Multi_DAE, why do you apply L2 normalization to the input vector?
I don't understand the purpose of that normalization.
Sorry to bother you, and thank you!
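For readers of the question above, here is a hypothetical numpy sketch of what that L2 normalization does (the repo's q_graph/forward_pass use TensorFlow; this is only an illustration, and the variable names are my own). Normalizing each user's click vector to unit norm puts heavy and light users on the same scale, so the encoder's input magnitude does not depend on how many items a user interacted with.

```python
import numpy as np

def l2_normalize(x, eps=1e-12):
    """Scale each row of x to unit L2 norm (eps guards against all-zero rows)."""
    return x / (np.linalg.norm(x, axis=-1, keepdims=True) + eps)

light = np.array([[1.0, 1.0, 0.0, 0.0]])  # user with 2 clicks
heavy = np.array([[1.0, 1.0, 1.0, 1.0]])  # user with 4 clicks

# After normalization both rows have (approximately) unit norm,
# even though the raw vectors had norms sqrt(2) and 2.
print(np.linalg.norm(l2_normalize(light)))
print(np.linalg.norm(l2_normalize(heavy)))
```

This is consistent with the dropout discussion earlier in the thread: normalization controls input scale, while dropout supplies the actual denoising noise.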