PyTorch-VAE
Negative value of STD (log_var) in latent representation.
I've noticed that on data outside the training set, the latent representation produced by my encoder contains negative values for the standard deviation. Can you please explain how this is possible? Thank you in advance!
As you said in the title, it is the logarithm of the variance. Whenever the variance is smaller than 1, log_var is negative.
To explain further: the reason we use log_var is precisely to avoid the problem of negative variance. The raw output of a neural network can be any real number, but a variance must be positive. So we interpret the output as log(variance); recovering the variance via exp(log_var) then always gives a positive number, no matter the sign of log_var.
For further explanation, do some reading on the reparameterization trick.
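To make this concrete, here is a minimal sketch with made-up numbers (plain Python rather than the repository's PyTorch code) showing that a negative log_var still yields a positive variance, and how it feeds into the reparameterized sample:

```python
import math
import random

# Hypothetical encoder outputs for one latent dimension (illustrative values).
mu = 0.3
log_var = -2.0  # negative is fine: it just means variance < 1

# Exponentiating recovers a strictly positive variance / std.
var = math.exp(log_var)        # always > 0, even though log_var < 0
std = math.exp(0.5 * log_var)  # square root of the variance

# Reparameterization trick: z = mu + std * eps with eps ~ N(0, 1),
# which keeps sampling differentiable w.r.t. mu and log_var.
eps = random.gauss(0.0, 1.0)
z = mu + std * eps

print(f"var={var:.4f}, std={std:.4f}")
```

So a negative log_var in the latent representation is expected behavior, not a bug.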