
Masking/padding tokens in sequences?

Open · d5555 opened this issue 2 years ago · 1 comment

How can we mask/pad tokens for sequences of varying length? When we apply the FFT along the sequence dimension (dim=-2), simple zero padding will skew the result:

torch.fft.fft(torch.fft.fft(hidden_states.float(), dim=-1), dim=-2).real
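For context, here is a minimal sketch of the mixing step in question plus one possible mitigation: zeroing out padded positions before the transform and re-applying the mask afterwards. The function names (fourier_mix, masked_fourier_mix) and the mask convention are assumptions for illustration, not part of FNet-pytorch. Note this only removes the padded values' contribution to the Fourier sums; the FFT length still includes the padded slots, so the transform is not truly length-aware.

```python
import torch

def fourier_mix(hidden_states: torch.Tensor) -> torch.Tensor:
    # FNet-style token mixing as quoted in the issue: FFT over the
    # hidden dim, then over the sequence dim, keeping the real part.
    return torch.fft.fft(
        torch.fft.fft(hidden_states.float(), dim=-1), dim=-2
    ).real

def masked_fourier_mix(hidden_states: torch.Tensor,
                       attention_mask: torch.Tensor) -> torch.Tensor:
    # Hypothetical workaround, not the library's API.
    # attention_mask: (batch, seq_len), 1 for real tokens, 0 for padding.
    mask = attention_mask.unsqueeze(-1).to(hidden_states.dtype)
    # Zero padded positions so they contribute nothing to the sums
    # over the sequence dimension (the FFT length is unchanged).
    mixed = fourier_mix(hidden_states * mask)
    # Re-apply the mask so padded positions stay zero downstream.
    return mixed * mask

# Toy usage: second sequence in the batch has 3 padding tokens.
batch, seq_len, d_model = 2, 8, 16
x = torch.randn(batch, seq_len, d_model)
attn = torch.ones(batch, seq_len)
attn[1, 5:] = 0
out = masked_fourier_mix(x, attn)  # (2, 8, 16)
```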

d5555 · Jul 29 '22, 23:07

Seconding this. It would be good to know how we should deal with this.

stevesmit · Sep 04 '22, 14:09