PyTorch-VAE

Negative value of STD (log_var) in latent representation.

Open TimofeyKruk opened this issue 5 years ago • 2 comments

I've noticed that on data not seen during training, the latent representation produced by my encoder contains negative values for the standard deviation (log_var). Can you please explain how this is possible? Thank you in advance!

TimofeyKruk avatar Nov 11 '20 14:11 TimofeyKruk

As you said in the title, it is the logarithm of the variance. If the variance is too small (< 1) then log_var can be negative.
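A quick numeric check (a stdlib-only sketch, not from the repo) shows why: the log of any variance below 1 is negative, yet exponentiating half the log-variance still recovers a valid positive standard deviation.

```python
import math

variance = 0.25                 # a perfectly valid (positive) variance
log_var = math.log(variance)    # negative, because variance < 1
std = math.exp(0.5 * log_var)   # recover std; always positive
print(log_var, std)             # log_var ~= -1.386, std == 0.5
```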

AntixK avatar Nov 12 '20 03:11 AntixK

To further explain: the reason we use log_var is to avoid the problem of negative variance. The raw output of a NN layer can be negative, so instead of predicting the variance directly we predict its logarithm. Exponentiating then guarantees a positive variance: exp(log_var) > 0 for any real log_var.

For further explanation do some reading on the reparameterization trick.
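For reference, the reparameterization step can be sketched in plain Python (the actual repo uses torch tensors; this stdlib version just illustrates the idea):

```python
import math
import random

def reparameterize(mu, log_var):
    """Sample z = mu + sigma * eps with eps ~ N(0, 1).

    The encoder outputs log_var unconstrained (it may be negative);
    exponentiating half of it yields sigma > 0 regardless of sign.
    """
    sigma = math.exp(0.5 * log_var)  # always positive
    eps = random.gauss(0.0, 1.0)     # noise from a standard normal
    return mu + sigma * eps

# log_var < 0 is fine: it just means sigma < 1 (here sigma ~= 0.5)
z = reparameterize(mu=0.0, log_var=-1.386)
```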

botkevin avatar Dec 01 '20 23:12 botkevin