simpletransformers
[Bug fixed] Change squeeze() to squeeze(0) to accommodate sequences of length 1
Problem description: the current preprocessing function squeezes length-1 sequences down to a 0-dim tensor, which causes an error in PyTorch dataloaders. Solution: replaced each squeeze() with squeeze(0) in seq2seq_utils.py, so that only the leading batch dimension is removed.
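A minimal sketch of the failure mode (illustrative tensor only; in the library the actual tensors come from the tokenizer inside seq2seq_utils.py):

```python
import torch

# A tokenized sequence of length 1, batch-shaped as (1, seq_len) = (1, 1).
ids = torch.tensor([[42]])

# squeeze() removes *every* size-1 dimension, leaving a 0-dim scalar tensor,
# which the DataLoader's default collation cannot stack with 1-dim tensors.
print(ids.squeeze().shape)   # torch.Size([])

# squeeze(0) removes only the leading batch dimension, keeping a 1-dim tensor
# of shape (seq_len,), which collates correctly regardless of sequence length.
print(ids.squeeze(0).shape)  # torch.Size([1])
```

For sequences longer than 1 the two calls behave identically, so the change only affects the length-1 edge case.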