convert_checkpoint_to_lsg
Index out of bounds for BART and other model architectures
I keep getting an `index 50264 is out of bounds for dimension 0 with size 50264` error (or something similar) when converting BART and some other models to LSG.
The issue seems to be this line of code in the `update_global` method:

```python
positions[1:] += u[mask_id].unsqueeze(0)
```
Hi @Gimperion
Can you share your `transformers` version and a snippet of the code you used?
I replicated the error with both transformers 4.26.0 and 4.28.1. Here's the snippet:
```python
from lsg_converter import LSGConverter

converter = LSGConverter(max_sequence_length=4096)
model, tokenizer = converter.convert_from_pretrained(
    "sshleifer/distilbart-cnn-6-6", block_size=256, sparsity_factor=2
)
```
I think I found the problem, @Gimperion.
Something is wrong with the model and the tokenizer.
The `<mask>` token has index 50264, while the model config states `"vocab_size": 50264`.
Since the first token has index 0, the tokenizer in practice holds 50265 tokens, but the embedding matrix only has 50264 rows (valid indices 0 to 50263), so the mask index is out of bounds.
If you try to run inference with the `<mask>` token, it fails.
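You can confirm the mismatch yourself; a minimal check, using the same checkpoint as above:

```python
from transformers import AutoConfig, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("sshleifer/distilbart-cnn-6-6")
config = AutoConfig.from_pretrained("sshleifer/distilbart-cnn-6-6")

# The mask token sits at index 50264, but the embedding matrix
# only has config.vocab_size == 50264 rows (valid indices 0..50263).
print(tokenizer.mask_token_id)  # 50264
print(config.vocab_size)        # 50264
```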
If you really need to convert the model, you have two possibilities (sketched below):
- expand the token embedding matrix
- use `random_global_init=True` or `--random_global_init` to skip the step with the mask token
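A minimal sketch of both options, reusing the snippet from above (assuming `convert_from_pretrained` accepts `random_global_init` as a keyword, per the flag names above; the resized checkpoint path is made up):

```python
from lsg_converter import LSGConverter
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

converter = LSGConverter(max_sequence_length=4096)

# Possibility 1: expand the token embedding matrix so index 50264 exists,
# save the fixed checkpoint, then convert it as usual.
base_model = AutoModelForSeq2SeqLM.from_pretrained("sshleifer/distilbart-cnn-6-6")
base_tokenizer = AutoTokenizer.from_pretrained("sshleifer/distilbart-cnn-6-6")
base_model.resize_token_embeddings(len(base_tokenizer))  # 50264 -> 50265 rows
base_model.save_pretrained("distilbart-cnn-6-6-resized")
base_tokenizer.save_pretrained("distilbart-cnn-6-6-resized")
model, tokenizer = converter.convert_from_pretrained(
    "distilbart-cnn-6-6-resized", block_size=256, sparsity_factor=2
)

# Possibility 2: skip the mask-token initialization entirely.
model, tokenizer = converter.convert_from_pretrained(
    "sshleifer/distilbart-cnn-6-6",
    block_size=256,
    sparsity_factor=2,
    random_global_init=True,
)
```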