
about entropy

Open albb762 opened this issue 6 years ago • 3 comments

Hi. Thank you for sharing your code, but I tried it with some toy data and found that if we want to assign a high score to data with a uniform P(y) and a skewed P(y|x), we should use entropy(py, pyx) instead of entropy(pyx, py). Here is the code from OpenAI: kl = part * (np.log(part) - np.log(np.expand_dims(np.mean(part, 0), 0))) https://github.com/openai/improved-gan/blob/master/inception_score/model.py

albb762 · Aug 05 '18 09:08
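For context, here is a minimal NumPy sketch of the split-wise computation that the quoted OpenAI line comes from; the `preds` array of softmax outputs and the `n_splits` value are assumptions for illustration, not code from this repo:

```python
import numpy as np

def inception_score(preds, n_splits=10):
    """preds: (N, num_classes) softmax outputs from the Inception network; rows sum to 1."""
    scores = []
    for part in np.array_split(preds, n_splits):
        py = np.expand_dims(np.mean(part, 0), 0)    # marginal p(y) over this split, shape (1, C)
        kl = part * (np.log(part) - np.log(py))     # per-sample, per-class KL terms (the quoted line)
        kl = np.mean(np.sum(kl, 1))                 # E_x[ KL(p(y|x) || p(y)) ]
        scores.append(np.exp(kl))
    return np.mean(scores), np.std(scores)
```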

Hi, I have the same confusion. So should I replace 'entropy(py, pyx)' with 'kl = part * (np.log(part) - np.log(np.expand_dims(np.mean(part, 0), 0)))'? In my opinion, these are not equal.

kmaeii · Dec 21 '19 13:12
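The two expressions do agree when the arguments are ordered as in the repo, since scipy.stats.entropy(pk, qk) computes sum(pk * log(pk / qk)); what matters is the argument order. A quick illustrative check (the probability values below are made up):

```python
import numpy as np
from scipy.stats import entropy

pyx = np.array([0.7, 0.2, 0.1])    # p(y|x) for one sample (made-up values)
py  = np.array([0.3, 0.4, 0.3])    # marginal p(y) (made-up values)

explicit  = np.sum(pyx * (np.log(pyx) - np.log(py)))   # the OpenAI-style per-sample KL
via_scipy = entropy(pyx, py)                           # KL(p(y|x) || p(y))

print(np.isclose(explicit, via_scipy))          # True
print(np.isclose(explicit, entropy(py, pyx)))   # False: argument order matters
```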

In the original paper ( https://papers.nips.cc/paper/6125-improved-techniques-for-training-gans.pdf on page 4), it is ‘entropy(pyx, py)’, or the KL divergence between p(y|x) and p(y).

Shane

sbarratt · Dec 21 '19 21:12
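To see the behaviour described in the paper on toy data (all numbers below are illustrative, not from the repo): sharply peaked p(y|x) together with a uniform marginal p(y) gives a high score, while a uniform p(y|x) gives a score of 1. A minimal sketch using entropy(pyx, py):

```python
import numpy as np
from scipy.stats import entropy

def score(preds):
    """exp(E_x[ KL(p(y|x) || p(y)) ]) for an (N, num_classes) array of softmax rows."""
    py = np.mean(preds, axis=0)
    return np.exp(np.mean([entropy(pyx, py) for pyx in preds]))

peaked  = np.eye(3)               # each p(y|x) is one-hot, and the marginal p(y) is uniform
uniform = np.full((3, 3), 1 / 3)  # every p(y|x) equals the marginal p(y)

print(score(peaked))    # ~3.0, i.e. the number of classes (the best possible score here)
print(score(uniform))   # 1.0, the worst possible score
```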

scipy.stats.entropy computes the KL divergence if two distributions are given.

"If qk is not None, then compute the Kullback-Leibler divergence" You can check the document below, https://docs.scipy.org/doc/scipy/reference/generated/scipy.stats.entropy.html

bomtorazek · May 12 '21 12:05
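A tiny check of that documented behaviour, with arbitrary example distributions: with one argument entropy returns the Shannon entropy, with two it returns the KL divergence.

```python
import numpy as np
from scipy.stats import entropy

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.25, 0.25, 0.5])

print(entropy(p))                  # one argument: Shannon entropy, -sum(p * log(p))
print(entropy(p, q))               # two arguments: KL divergence, sum(p * log(p / q))
print(np.sum(p * np.log(p / q)))   # matches the previous line
```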