vector-quantize-pytorch
LatentQuantize exploding loss
I'm trying to use the LatentQuantize model in an autoencoder context. My inputs are flat 1-d tensors of size 32, and my encoder passes a tensor of shape (batch_size, 64) to the quantizer. For now, my `levels` is [8, 6, 4] and my `latent_dim` is 64:
```python
self.lq = LatentQuantize(
    levels=levels,
    dim=latent_dim,
    commitment_loss_weight=0.1,
    quantization_loss_weight=0.1,
)
```
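For completeness, here's a stripped-down sketch of how everything is wired together. The `nn.Linear` encoder/decoder and the combined training loss are simplified stand-ins for my actual model, and I'm assuming the quantizer returns a `(quantized, indices, loss)` tuple as in the repo's README example:

```python
import torch
from torch import nn
from vector_quantize_pytorch import LatentQuantize

class AutoEncoder(nn.Module):
    # Simplified stand-in: 32-dim inputs, 64-dim latent, levels [8, 6, 4]
    def __init__(self, input_dim=32, latent_dim=64, levels=(8, 6, 4)):
        super().__init__()
        self.encoder = nn.Linear(input_dim, latent_dim)
        self.lq = LatentQuantize(
            levels=list(levels),
            dim=latent_dim,
            commitment_loss_weight=0.1,
            quantization_loss_weight=0.1,
        )
        self.decoder = nn.Linear(latent_dim, input_dim)

    def forward(self, x):
        z = self.encoder(x)                       # (batch_size, 64) into the quantizer
        quantized, indices, quantize_loss = self.lq(z)
        recon = self.decoder(quantized)
        return recon, quantize_loss

model = AutoEncoder()
x = torch.randn(16, 32)                           # batch of flat 1-d inputs
recon, quantize_loss = model(x)

# Training loss = reconstruction + the quantizer's own loss term;
# it's this combined loss that starts at zero and then explodes.
loss = nn.functional.mse_loss(recon, x) + quantize_loss
loss.backward()
```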
The loss starts at zero, then increases exponentially.
Any thoughts as to why this might happen?