
Confusion about the dimension of latent vectors

Open LiJiahao-Alex opened this issue 1 month ago • 0 comments

Hi.

There is a print statement in the `Decoder` class: `print("Working with z of shape {} = {} dimensions.".format(self.z_shape, np.prod(self.z_shape)))`. Here the letter `z` seems to denote the latent vector, by convention. But in the `Encoder` class, `h` is the name used for the latent vector. What I don't understand is: if `z` is the bottleneck, why is `h` used as the latent vector in the encoder?
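For context, here is a minimal sketch (hypothetical, not the repo's actual code) of the naming convention the question is about: inside an encoder, `h` usually names the running hidden activation, and the tensor at the bottleneck is handed to the decoder under the name `z`, so the same tensor appears under both names.

```python
import numpy as np

def encoder(x):
    # `h` is just the running hidden activation inside the encoder;
    # the arithmetic below is a stand-in for conv layers.
    h = x * 2.0
    h = h + 1.0
    return h  # the encoder's final `h` is the latent, handed on as `z`

def decoder(z):
    # By convention the decoder's input is named `z`: it is the same
    # tensor the encoder returned as its last `h`.
    print("Working with z of shape {} = {} dimensions.".format(
        z.shape, np.prod(z.shape)))
    return (z - 1.0) / 2.0  # invert the stand-in encoder

x = np.zeros((1, 4, 4))
z = encoder(x)      # encoder's last `h` becomes the decoder's `z`
x_rec = decoder(z)
```

So `h` vs. `z` is purely a local naming habit, not two different quantities.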

Looking forward to your response. Best

LiJiahao-Alex · May 30 '24 18:05