
LatentQuantize exploding loss

Open jbmaxwell opened this issue 1 year ago • 2 comments

I'm trying to use the LatentQuantize model in an autoencoder context. My inputs are flat 1-D tensors of size 32, and my encoder passes a tensor of shape (batch_size, 64) to the quantizer. For now, my levels are [8, 6, 4] and my latent_dim is 64:

from vector_quantize_pytorch import LatentQuantize

self.lq = LatentQuantize(
    levels=levels,                    # [8, 6, 4]
    dim=latent_dim,                   # 64
    commitment_loss_weight=0.1,
    quantization_loss_weight=0.1,
)
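For context, the forward pass is roughly the following (a simplified sketch; self.encoder and self.decoder are my own modules, and the three-value return matches the library's README):

import torch.nn.functional as F

def forward(self, x):
    z = self.encoder(x)                        # (batch_size, 64)
    quantized, indices, lq_loss = self.lq(z)   # lq_loss = weighted commitment + quantization terms
    recon = self.decoder(quantized)
    recon_loss = F.mse_loss(recon, x)
    return recon, recon_loss + lq_loss         # lq_loss is the term that explodes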

The loss starts at zero, then increases exponentially: [screenshot: training loss curve, Aug 1 2024]
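One thing I've been checking (a hypothetical diagnostic: since LatentQuantize snaps each latent dimension to a fixed set of levels, the commitment/quantization terms should grow with the distance between the encoder output and those values):

with torch.no_grad():
    z = self.encoder(x)
    # if these keep growing over training, the encoder output is drifting
    # away from the range the quantizer's levels can represent
    print(z.abs().mean().item(), z.abs().max().item())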

Any thoughts as to why this might happen?

jbmaxwell · Aug 01 '24 20:08