generative-models

Vanilla Variational Auto Encoder

Open markusMM opened this issue 6 years ago • 0 comments

Hello WiseOdd,

I have come across your vanilla VAE, which seems pretty neat, despite the fact that it does not work the way I remember a Gaussian noise sparse model working.

I have recently read 1606.05908, where VAEs are explained quite well. There, the PDF of X is described as an expectation over $P(Z)$. Now, when you update the parameters of the Gaussian, the covariance is normally a diagonal matrix with $\sigma^2$ on the diagonal. So, of course, one could think of one variance per dimension of X, but I am not quite sure that this is the common idea behind the variance of the Gaussian distribution in generative models.
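For concreteness, with a diagonal covariance the Gaussian log-density factorises per dimension, so no full $D \times D$ matrix is ever needed. A minimal sketch (the function name and arguments are mine, not from the repo):

```python
import numpy as np

def diag_gaussian_logpdf(x, mu, sigma2):
    """log N(x; mu, diag(sigma2)) for a single sample.

    x, mu, sigma2: 1-D arrays of length D. sigma2 holds only the
    diagonal of the covariance, so the density factorises over
    dimensions and no DxD matrix is formed.
    """
    D = x.shape[0]
    return -0.5 * (D * np.log(2 * np.pi)
                   + np.sum(np.log(sigma2))
                   + np.sum((x - mu) ** 2 / sigma2))
```

With unit variances and `x == mu`, this reduces to $-\frac{D}{2}\log 2\pi$, which is a quick sanity check.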

$P(X) \approx \frac{1}{D_z} \sum_{z \in Z} P(X|z, \Theta) P(z)$ — using this notation, the expectation for $\sigma^2$ based on the data would be something like $\frac{1}{N} \sum_{n=1}^{N} \langle X_n - \mu(z;\Theta),\, X_n - \mu(z;\Theta) \rangle$, right?
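If I understand the idea, the maximum-likelihood style estimate of a shared $\sigma^2$ under an isotropic Gaussian noise model is just the mean squared reconstruction residual. A minimal NumPy sketch of what I mean (names are mine, not from your code):

```python
import numpy as np

def estimate_noise_variance(X, X_recon):
    """ML-style estimate of a single scalar sigma^2 for an isotropic
    Gaussian noise model p(X|z) = N(mu(z), sigma^2 * I).

    X, X_recon: arrays of shape (N, D) -- data and decoder means mu(z_n).
    Averages the squared residual <r_n, r_n> over all N*D entries.
    """
    residual = X - X_recon
    return np.mean(residual ** 2)

def estimate_diagonal_variance(X, X_recon):
    """Per-dimension (diagonal) variant: one sigma_d^2 per data dimension."""
    return np.mean((X - X_recon) ** 2, axis=0)
```

Either way, the result is a scalar or a length-D vector, never a full matrix.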

Maybe you can take a look at the code, because your variance seems to appear as a full matrix instead of a single value or a set of per-dimension values for the Gaussian noise model.

So, did you think explicitly about a full covariance matrix, or was it just trial and error in this case?

If you try out expectation value(s) for the variance, let me know what your experience is.

Regards, Markus

markusMM · Feb 22 '19 11:02