tensorflow-generative-model-collections
Question about marginal_likelihood in VAE
Thank you for your work and for sharing it. I learned a lot from it.
However, I have a question about the VAE implementation.
In VAE.py, you calculate the marginal_likelihood as a cross entropy:
```python
marginal_likelihood = tf.reduce_sum(self.inputs * tf.log(self.out) + (1 - self.inputs) * tf.log(1 - self.out), [1, 2])
```
However, I am confused, because the formula is:

log p(x) >= E_{q(z|x)}[ log p(x|z) ] - KL( q(z|x) || p(z) )

where the first term on the right-hand side should correspond to the marginal_likelihood. I think the latent variable z should be involved in computing the likelihood, but you use the initial input, self.inputs.
So I am a little confused; could you explain it?
Thank you very much!
Hi.
There are two tricks in the implementation of the marginal likelihood. The first is to use a Monte Carlo estimate instead of the integral. The second is to use only one sample in that Monte Carlo estimate.
With these tricks, the marginal likelihood is approximated by log p(x_i|z). Since p(x_i|z) is assumed to follow a Bernoulli distribution, log p(x_i|z) is the cross entropy between the network input and output.
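In code, the two tricks look roughly like this. This is a minimal sketch in the style of VAE.py, where `encoder` and `decoder` are hypothetical stand-ins for the actual networks; the last line is the one you quoted. Note that z does enter the likelihood: `out` is the decoder applied to the sampled z, so the cross entropy with the input is exactly the one-sample estimate of E_{q(z|x)}[log p(x|z)].

```python
import tensorflow as tf  # TF 1.x, as in the repo

# Hypothetical stand-ins for the encoder/decoder networks in VAE.py.
mu, log_sigma_sq = encoder(inputs)           # parameters of q(z|x)
eps = tf.random_normal(tf.shape(mu))         # trick 2: a single MC sample
z = mu + tf.exp(0.5 * log_sigma_sq) * eps    # reparameterization trick

out = decoder(z)                             # Bernoulli mean of p(x|z)

# Trick 1: one-sample Monte Carlo estimate of E_{q(z|x)}[log p(x|z)],
# i.e. the Bernoulli log-likelihood of the input under the reconstruction,
# which is the (negative) cross entropy between network input and output.
marginal_likelihood = tf.reduce_sum(
    inputs * tf.log(out) + (1 - inputs) * tf.log(1 - out), [1, 2])
```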
Please check the following.
@hwalsuklee Hi, thanks for the explanation; I understand it now. I have another question: there is a KL gap between the ELBO and log p(x), so how can I compute p(x) itself? Thank you!
Dear @hwalsuklee: do you know how the negative log-likelihood on, e.g., the CIFAR test set is reported in VAE-like models? Paper examples etc. are included in my question here: https://www.reddit.com/r/MLQuestions/comments/9sp6d3/how_to_calculate_log_likelihood_for_vaevqvae/
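For reference, test-set log-likelihoods for VAE-like models are usually estimated with importance sampling (Burda et al., "Importance Weighted Autoencoders"), which closes the KL gap mentioned above as the number of samples K grows. A minimal sketch, assuming hypothetical sampling and log-density functions that the reader would supply for their own model:

```python
import numpy as np
from scipy.special import logsumexp

def estimate_log_px(x, sample_z, log_p_x_given_z, log_p_z, log_q_z_given_x,
                    K=5000):
    """Importance-sampling estimate of log p(x).

    sample_z, log_p_x_given_z, log_p_z, and log_q_z_given_x are hypothetical
    callables standing in for the model's sampler and log densities.
    """
    log_w = np.empty(K)
    for k in range(K):
        z = sample_z(x)  # z_k ~ q(z|x)
        # log w_k = log p(x|z_k) + log p(z_k) - log q(z_k|x)
        log_w[k] = log_p_x_given_z(x, z) + log_p_z(z) - log_q_z_given_x(z, x)
    # log p(x) ~ log((1/K) * sum_k w_k), computed stably via logsumexp;
    # the estimate tightens toward the true log p(x) as K grows.
    return logsumexp(log_w) - np.log(K)
```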