simpletransformers

[Bug fixed] Change squeeze() to squeeze(0) to accommodate a sequence length of 1

Open ChenWu98 opened this issue 4 months ago • 0 comments

Problem description: the current preprocessing function calls squeeze(), which collapses a length-1 sequence into a 0-dim tensor and causes an error in PyTorch DataLoaders.

Solution: replace each squeeze() with squeeze(0) in seq2seq_utils.py so that only the batch dimension is removed.
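For illustration only (this is not the actual code in seq2seq_utils.py), here is a minimal sketch of the difference on a tokenized sequence of length 1:

```python
import torch

# A tokenized sequence of length 1, shaped (batch=1, seq_len=1),
# as a tokenizer returning PyTorch tensors typically produces.
input_ids = torch.tensor([[101]])

# squeeze() removes every size-1 dimension, leaving a 0-dim scalar
# that the default DataLoader collation cannot stack.
print(input_ids.squeeze().shape)   # torch.Size([])

# squeeze(0) removes only the batch dimension and keeps the sequence axis.
print(input_ids.squeeze(0).shape)  # torch.Size([1])
```

For sequences longer than 1 the two calls behave identically, which is why the bug only shows up on length-1 inputs.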

ChenWu98 · Oct 08 '24 22:10