
Bug in perplexity calculation?

Open prashantserai opened this issue 4 years ago • 0 comments

Hi! The perplexity calculation here (line 140 of `train` in nvdm.py) is: `print_ppx = np.exp(loss_sum / word_count)`

However, `loss_sum` accumulates `loss`, which is the value of `model.objective`, i.e. the sum of the reconstruction loss (cross-entropy) and the KL divergence (lines 129-132 of `train` in nvdm.py):

```python
_, (loss, kld) = sess.run((optim,
                           [model.objective, model.kld]),
                          input_feed)
loss_sum += np.sum(loss)
```

Line 78, in the model definition in nvdm.py: `self.objective = self.recons_loss + self.kld`
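
To make the arithmetic explicit (the numbers and the `recons_sum` / `kld_sum` names below are just illustrative, not from nvdm.py): because the objective is the sum of the two terms, the KL term ends up inside the exponent of the reported perplexity.

```python
# Illustrative only: how the KL term inflates the reported perplexity.
# exp((recons_sum + kld_sum) / word_count)
#   = exp(recons_sum / word_count) * exp(kld_sum / word_count)
# Since KL divergence is non-negative, the second factor is >= 1,
# so the printed perplexity can only overstate the cross-entropy perplexity.
import numpy as np

recons_sum, kld_sum, word_count = 5000.0, 300.0, 1200  # made-up numbers
ppx_reported = np.exp((recons_sum + kld_sum) / word_count)
ppx_ce_only = np.exp(recons_sum / word_count)
assert ppx_reported >= ppx_ce_only
```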

I thought perplexity is usually the exponentiation of the normalized cross-entropy, so is there a technical reason for using `model.objective` instead of `model.recons_loss` to calculate the perplexity, or is that a bug? The numbers should only get better if this is corrected, since the KL divergence is non-negative.
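
For reference, a sketch of the change I'd expect relative to the snippet above, assuming `model.recons_loss` can be fetched in the same `sess.run` call (it is defined in the model per line 78); the `recons_sum` name is hypothetical:

```python
# Sketch of the fix: accumulate the reconstruction loss separately
# and exponentiate only the normalized cross-entropy.
_, (recons, kld) = sess.run((optim,
                             [model.recons_loss, model.kld]),
                            input_feed)
recons_sum += np.sum(recons)

# ... later, where perplexity is printed:
print_ppx = np.exp(recons_sum / word_count)  # cross-entropy only, no KL term
```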

prashantserai · Jun 14 '20 16:06