about entropy #9
Hi,
In the original paper (https://papers.nips.cc/paper/6125-improved-techniques-for-training-gans.pdf, page 4), it is 'entropy(pyx, py)', i.e. the KL divergence between p(y|x) and p(y).
Shane
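For reference, the score in that paper is exp(E_x[ KL( p(y|x) || p(y) ) ]), so with scipy the per-sample term is entropy(pyx, py). A minimal sketch of that reading (the name preds is hypothetical and stands for an N x C matrix of softmax outputs; the splitting into chunks that implementations usually do is omitted):

import numpy as np
from scipy.stats import entropy

def inception_score(preds):
    # preds: (N, C) array, each row a p(y|x) from the classifier
    py = preds.mean(axis=0)                       # marginal p(y)
    kls = [entropy(pyx, py) for pyx in preds]     # KL(p(y|x) || p(y)) per sample
    return np.exp(np.mean(kls))                   # exp of the average KL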
…On Sun, Dec 22, 2019 at 12:33 AM kmaeii ***@***.***> wrote:
Hi,
I have the same confusion as you. So, should I replace 'entropy(py, pyx)' with 'kl = part * (np.log(part) - np.log(np.expand_dims(np.mean(part, 0), 0)))'? In my opinion, these are not equal.
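For what it's worth, a quick numerical check (with a made-up toy batch) suggests that OpenAI's kl line, once summed over classes, matches entropy(pyx, py) per sample rather than entropy(py, pyx):

import numpy as np
from scipy.stats import entropy

# made-up toy batch of softmax outputs, shape (N, C)
part = np.array([[0.7, 0.2, 0.1],
                 [0.1, 0.1, 0.8]])

py = np.mean(part, axis=0)                                    # marginal p(y)
kl = part * (np.log(part) - np.log(np.expand_dims(py, 0)))    # OpenAI's element-wise term
openai_per_sample = np.sum(kl, axis=1)                        # sum over classes -> one KL per sample

scipy_per_sample = np.array([entropy(pyx, py) for pyx in part])  # entropy(pyx, py) per row

print(np.allclose(openai_per_sample, scipy_per_sample))       # prints True on this toy batch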
scipy.stats.entropy computes the KL divergence if two distributions are given: "If qk is not None, then compute the Kullback-Leibler divergence"
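For illustration, a minimal example of that behaviour (the numbers are arbitrary):

import numpy as np
from scipy.stats import entropy

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.5, 0.3, 0.2])

print(entropy(p))                   # one argument: Shannon entropy of p
print(entropy(p, q))                # two arguments: KL divergence D(p || q)
print(np.sum(p * np.log(p / q)))    # same value as entropy(p, q)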
Hi.
Thank you for sharing your code.
But I tried it with some toy data.
I found that if we want to assign a high score to data which has a uniform P(y) and a skewed P(y|x), we should use entropy(py, pyx) instead of entropy(pyx, py).
And here is the code from OpenAI:
kl = part * (np.log(part) - np.log(np.expand_dims(np.mean(part, 0), 0)))
https://github.com/openai/improved-gan/blob/master/inception_score/model.py
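A small sketch of the kind of toy check being described (the distributions below are made up), printing both orderings so they can be compared directly:

import numpy as np
from scipy.stats import entropy

pyx = np.array([0.90, 0.05, 0.05])   # skewed p(y|x) for one sample (made up)
py = np.array([1/3, 1/3, 1/3])       # uniform marginal p(y)

print(entropy(pyx, py))  # KL(p(y|x) || p(y)), the ordering used in the paper
print(entropy(py, pyx))  # KL(p(y) || p(y|x)), the ordering suggested above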