
sample_good() function in transformer.py

Open jeeyung opened this issue 2 years ago • 0 comments

Hi!

I think the shape of the logits from self.tokens_to_logits is [batch, 257, 1026], because you defined self.tok_emb = nn.Embedding(args.num_codebook_vectors + 2, args.dim).

However, the codebook only has 1024 embeddings, so this causes errors. Haven't you seen these errors during sampling? Did I miss something here?
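A minimal sketch of the mismatch I mean (the two extra embedding rows are presumably the mask and sos tokens; `dim` and the variable names here are illustrative, not the repo's exact code):

```python
import torch
import torch.nn as nn

num_codebook_vectors = 1024   # VQGAN codebook size
dim = 512                     # hypothetical embedding dimension

# The transformer's vocabulary has 2 extra ids beyond the codebook,
# as defined via nn.Embedding(args.num_codebook_vectors + 2, args.dim):
tok_emb = nn.Embedding(num_codebook_vectors + 2, dim)

# So tokens_to_logits produces 1026-way logits over the 257 positions:
logits = torch.randn(1, 257, num_codebook_vectors + 2)

# Sampling over all 1026 categories can return ids 1024 or 1025 ...
sampled = torch.distributions.Categorical(logits=logits).sample()

# ... which are out of range for the 1024-entry codebook lookup:
codebook = nn.Embedding(num_codebook_vectors, dim)
raised = False
try:
    codebook(torch.tensor([num_codebook_vectors]))  # id 1024 -> IndexError
except IndexError:
    raised = True
```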

jeeyung avatar Feb 22 '23 15:02 jeeyung