vector-quantize-pytorch

Vector (and Scalar) Quantization, in Pytorch

61 vector-quantize-pytorch issues, sorted by recently updated

```
import torch
from vector_quantize_pytorch import VectorQuantize

vq = VectorQuantize(
    dim = 256,
    codebook_size = 512,  # codebook size
    decay = 0.8,          # the exponential moving average decay, lower means...
```
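The snippet above is cut off; for reference, the complete basic-usage example from the library's README is approximately the following (the `commitment_weight` line and the forward-call shapes are recalled from the README, so double-check against the current version):

```
import torch
from vector_quantize_pytorch import VectorQuantize

vq = VectorQuantize(
    dim = 256,
    codebook_size = 512,     # codebook size
    decay = 0.8,             # the exponential moving average decay; lower means the codebook changes faster
    commitment_weight = 1.   # the weight on the commitment loss
)

x = torch.randn(1, 1024, 256)
quantized, indices, commit_loss = vq(x)  # (1, 1024, 256), (1, 1024), (1,)
```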

In the examples, you report `f"active %: {indices.unique().numel() / num_codes * 100:.3f}"`, for example: https://github.com/lucidrains/vector-quantize-pytorch/blob/master/examples/autoencoder_fsq.py#L76 But this does not generalize when the `num_codebooks` parameter is set to 2 or more....
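For illustration, one way to compute utilization separately per codebook when `indices` carries a codebook dimension (the shape convention below is an assumption; check the actual output shape for your configuration):

```
import torch

def active_percent_per_codebook(indices, num_codes):
    # assumes the last dimension of `indices` is the codebook dimension
    flat = indices.reshape(-1, indices.shape[-1])
    return [
        flat[:, c].unique().numel() / num_codes * 100
        for c in range(flat.shape[-1])
    ]

# hypothetical shapes: (batch, seq, num_codebooks) indices into 512 codes
indices = torch.randint(0, 512, (8, 1024, 2))
print(active_percent_per_codebook(indices, num_codes = 512))
```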

Hi all, I want to raise some concerns about the use of the `gumbel_sample` method when `reinmax=False`. https://github.com/lucidrains/vector-quantize-pytorch/blob/6102e37efefefb673ebc8bec3abb02d5030dd933/vector_quantize_pytorch/vector_quantize_pytorch.py#L472-L472 First, this [sampling technique is mathematically...
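For readers unfamiliar with the pattern under discussion, a minimal straight-through Gumbel sample looks roughly like this (a generic sketch, not the library's exact implementation):

```
import torch
import torch.nn.functional as F

def gumbel_sample_st(logits, temperature = 1.):
    # add Gumbel noise, take a hard one-hot forward, pass soft gradients backward
    gumbel = -torch.log(-torch.log(torch.rand_like(logits).clamp(min = 1e-10)).clamp(min = 1e-10))
    soft = F.softmax((logits + gumbel) / temperature, dim = -1)
    hard = F.one_hot(soft.argmax(dim = -1), logits.shape[-1]).type_as(soft)
    # straight-through estimator: hard values forward, soft gradients backward
    return hard + soft - soft.detach()
```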

This adds some slightly confusing masking code, but improves speed by 3x by making the shapes of intermediate tensors non-dynamic. The `masked_mean` code is equivalent, up to floating-point precision, with...
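As a reference point, a typical `masked_mean` with static shapes looks like the following (a generic sketch; the PR's actual helper may differ):

```
import torch

def masked_mean(t, mask, dim = 1):
    # zero out padded positions instead of dropping them, keeping shapes static
    t = t.masked_fill(~mask.unsqueeze(-1), 0.)
    num = t.sum(dim = dim)
    den = mask.sum(dim = dim, keepdim = True).clamp(min = 1)
    return num / den
```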

Hi, I am trying to recover the quantized vectors from a saved codebook and indices, but `self.codebooks` always comes back as 0. How can I do this?

```
import torch
from vector_quantize_pytorch import ResidualVQ
...
```
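If the goal is just to map saved indices back to quantized vectors, `ResidualVQ` exposes a helper for this (method name per the library's source; verify against your installed version):

```
import torch
from vector_quantize_pytorch import ResidualVQ

residual_vq = ResidualVQ(dim = 256, num_quantizers = 8, codebook_size = 1024)

x = torch.randn(1, 1024, 256)
quantized, indices, commit_loss = residual_vq(x)

# map stored indices back to quantized vectors through the trained codebooks
reconstructed = residual_vq.get_output_from_indices(indices)
```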

Hi, I want to train a Residual LFQ model for audio, and this is my core code:

```
def _loss_fn(loss_fn, x_target, x_pred, cfg, padding_mask=None):
    if padding_mask is not None:
        padding_mask...
```
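Since the snippet is truncated, here is one common way to apply a padding mask inside such a loss, for context (a sketch under the assumption that `padding_mask` is 1 for valid frames and 0 for padding):

```
import torch
import torch.nn.functional as F

def masked_recon_loss(x_target, x_pred, padding_mask = None):
    # elementwise loss, averaged over valid (unpadded) positions only
    loss = F.l1_loss(x_pred, x_target, reduction = 'none')
    if padding_mask is None:
        return loss.mean()
    mask = padding_mask.unsqueeze(-1).type_as(loss)   # (b, t, 1), broadcast over features
    return (loss * mask).sum() / mask.expand_as(loss).sum().clamp(min = 1.)
```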

Hi, I tried to use both a codebook loss and a commitment loss instead of the EMA update, but I was confused about how to use the codebook loss. If `learnable_codebook` is True, then...
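For context, gradient-based codebook learning is typically enabled by turning off the EMA update and making the codebook learnable (parameter names per the library; the specific values below are illustrative):

```
import torch
from vector_quantize_pytorch import VectorQuantize

vq = VectorQuantize(
    dim = 256,
    codebook_size = 512,
    learnable_codebook = True,   # let gradients (the codebook loss) update the codebook
    ema_update = False,          # disable the EMA update so the two don't conflict
    commitment_weight = 0.25     # weight on the commitment term (illustrative value)
)
```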

Hi, I just wonder whether it is possible to adopt cosine distance instead of Euclidean distance in Residual VQ? Have you tried this before?
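For what it's worth, `VectorQuantize` exposes a `use_cosine_sim` flag, and `ResidualVQ` appears to forward extra keyword arguments to its quantizer layers (worth verifying on your installed version):

```
import torch
from vector_quantize_pytorch import ResidualVQ

residual_vq = ResidualVQ(
    dim = 256,
    num_quantizers = 4,
    codebook_size = 1024,
    use_cosine_sim = True   # cosine-similarity lookup instead of Euclidean distance
)

x = torch.randn(1, 1024, 256)
quantized, indices, commit_loss = residual_vq(x)
```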

Hi, I am training VQ-VAE-1 on my own dataset. The results were fine, but the commitment loss suddenly became very large at epoch 115, while the reconstructed pictures became meaningful...