FNet-pytorch
Masking/padding tokens in sequences?
How can we mask or pad tokens for sequences of varying length? When we apply the FFT along the sequence dimension (dim=-2), simply zero-padding skews the result, since the transform mixes every position, padded slots included, into each output position:

```python
torch.fft.fft(torch.fft.fft(hidden_states.float(), dim=-1), dim=-2).real
```
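For concreteness, here is a minimal sketch of the kind of workaround I have in mind: zero out the hidden states at padded positions before the transform and re-zero them afterwards. The `pad_mask` tensor and the function name are my own illustration, not part of this repo, and since the DFT still runs over the full padded length I'm not sure this removes the skew entirely:

```python
import torch

def masked_fourier_mixing(hidden_states: torch.Tensor, pad_mask: torch.Tensor) -> torch.Tensor:
    """Fourier token mixing with padded positions zeroed before and after.

    hidden_states: (batch, seq_len, hidden_dim)
    pad_mask:      (batch, seq_len) boolean, True at real tokens
    """
    # Zero the padded positions so they contribute nothing to the DFT sums
    # along the sequence dimension.
    x = hidden_states.float() * pad_mask.unsqueeze(-1)

    # Same double FFT as above: over the hidden dim first, then the sequence dim.
    mixed = torch.fft.fft(torch.fft.fft(x, dim=-1), dim=-2).real

    # The FFT along dim=-2 mixes all positions, so padded slots are nonzero
    # again afterwards; mask them so downstream layers can ignore them.
    return mixed * pad_mask.unsqueeze(-1)


# Hypothetical usage: batch of 2 sequences, the second padded to length 6.
hidden_states = torch.randn(2, 6, 8)
pad_mask = torch.tensor([[True] * 6, [True] * 4 + [False] * 2])
out = masked_fourier_mixing(hidden_states, pad_mask)
```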
Seconding this. It would be good to know how we should deal with padding here.