pytorch-vq-vae

PyTorch implementation of VQ-VAE by Aäron van den Oord et al.

6 pytorch-vq-vae issues

Hi, thanks for your clean implementation! I was wondering, have you ever tried to calculate the bits/dimension metric (as in the original paper)? I've tried to...
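For reference, a minimal sketch of the usual bits/dimension computation, assuming a Bernoulli (binary cross-entropy) reconstruction likelihood; the helper name and reduction choices here are assumptions, not code from this repository:

```python
import math
import torch.nn.functional as F

def bits_per_dim(recon_logits, x):
    """Hypothetical helper: bits/dimension from a Bernoulli likelihood.

    Assumes `x` is in [0, 1] and `recon_logits` holds pre-sigmoid logits.
    """
    # Negative log-likelihood in nats, summed over every pixel in the batch
    nll_nats = F.binary_cross_entropy_with_logits(recon_logits, x, reduction="sum")
    num_dims = x[0].numel()  # dimensions per example (C*H*W)
    # Convert nats -> bits, then normalize per example and per dimension
    return nll_nats / (math.log(2) * x.size(0) * num_dims)
```

Note this only covers the reconstruction term; matching the paper's numbers would also require accounting for the prior over the discrete latents.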

By updating all embeddings regardless of whether they are being used, you are decaying them towards 0. Is this intended? I tried removing the decay but it seems to decrease...
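A toy demonstration of the decay being described, using the standard EMA vector-quantizer recipe (variable names are assumptions, not this repo's code verbatim): a code that is never selected receives zero one-hot mass, so its running statistics are multiplied by `decay` at every step.

```python
import torch

decay, K, D, N = 0.99, 4, 2, 16
ema_cluster_size = torch.ones(K)
ema_w = torch.randn(K, D)

flat_input = torch.randn(N, D)
assignments = torch.randint(0, K - 1, (N,))  # code K-1 is never selected
encodings = torch.nn.functional.one_hot(assignments, K).float()

for _ in range(100):
    # EMA update applied to *all* codes, used or not
    ema_cluster_size = ema_cluster_size * decay + (1 - decay) * encodings.sum(0)
    dw = encodings.t() @ flat_input  # (K, D) sum of inputs assigned to each code
    ema_w = ema_w * decay + (1 - decay) * dw

# Code K-1's statistics shrink geometrically: 0.99**100 ~ 0.37 of their start
print(ema_cluster_size[-1], ema_w[-1].norm())
```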

Hi everybody, looking at the VectorQuantizerEMA nn.Module in the code, I was not able to understand how the codebook vectors are updated after initialization. Is there a way to force...
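In a typical EMA quantizer the codebook is refreshed inside `forward()` while the module is in training mode, and without gradients: the embedding weights are overwritten from the running averages rather than learned through backprop. A sketch of that final write, with assumed names (`ema_w`, `ema_cluster_size`, `eps`), not the repository's code verbatim:

```python
import torch

K, D, eps = 4, 2, 1e-5
embedding = torch.nn.Embedding(K, D)
ema_cluster_size = torch.rand(K) * 10
ema_w = torch.randn(K, D)

n = ema_cluster_size.sum()
# Laplace smoothing keeps rarely used codes from dividing by ~zero
smoothed = (ema_cluster_size + eps) / (n + K * eps) * n
# Codebook vectors become the (smoothed) average of their assigned inputs
embedding.weight.data.copy_(ema_w / smoothed.unsqueeze(1))
```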

```python
def forward(self, inputs):
    # convert inputs from BCHW -> BHWC
    inputs = inputs.permute(0, 2, 3, 1).contiguous()
    input_shape = inputs.shape

    # Flatten input
    flat_input = inputs.view(-1, self._embedding_dim)
```

My unders...

Hi, I can't figure out why we need to change from BCHW to BHWC before we flatten. I would be happy if you could explain this step. Thank you! #...
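A short illustration of why the permute matters: `view(-1, D)` groups the last dimension into rows, so with channels last (BHWC) each row of `flat_input` is the D-dimensional vector at one spatial position, which is exactly what gets matched against the codebook. Flattening BCHW directly would mix channels across positions. The shapes below are illustrative (B=1, D=4 channels, an 8x8 feature map):

```python
import torch

x = torch.randn(1, 4, 8, 8)  # BCHW
flat = x.permute(0, 2, 3, 1).contiguous().view(-1, 4)
print(flat.shape)  # torch.Size([64, 4]): one 4-dim vector per spatial position
```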

Hi! Thank you for the great upload. How exactly can I extract the latent code of an image? By that I mean the code of shape, e.g., [1, 8, 8] and not...
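A minimal sketch of extracting the grid of discrete code indices, assuming a trained `model` and an `image` tensor, plus the module names suggested by the excerpts above (`_encoder`, `_pre_vq_conv`, `_vq_vae._embedding`); treat these names, and `torch.cdist` standing in for the quantizer's squared-distance computation, as assumptions to adapt to your model:

```python
import torch

with torch.no_grad():
    z = model._pre_vq_conv(model._encoder(image))  # (1, D, 8, 8), assumed modules
    flat = z.permute(0, 2, 3, 1).reshape(-1, z.size(1))  # (64, D)
    # Nearest codebook entry per spatial position
    d = torch.cdist(flat, model._vq_vae._embedding.weight)  # (64, K)
    indices = d.argmin(dim=1).view(1, 8, 8)  # the [1, 8, 8] discrete code
```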