
Masking/padding tokens in sequences of varied length?

Open d5555 opened this issue 2 years ago • 0 comments

How can we mask/pad tokens for sequences of varying length? When we apply the FFT along the sequence dimension (-2), as in `torch.fft.fft2(x, dim=(-1, -2)).real`, simply zero-padding the sequences will skew the result.
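To make the concern concrete, here is a minimal sketch (my own illustration, not code from this repo) showing that zero-padded positions still take part in the FNet-style Fourier mixing, so the outputs at the real token positions change once padding is added. The tensor shapes and variable names are hypothetical.

```python
import torch

torch.manual_seed(0)
d_model = 4
x = torch.randn(6, d_model)                           # a sequence of true length 6
x_padded = torch.cat([x, torch.zeros(2, d_model)])    # zero-pad to length 8

# FNet-style token mixing: 2D FFT over (sequence, hidden) dims, keep the real part
mix = torch.fft.fft2(x, dim=(-1, -2)).real
mix_padded = torch.fft.fft2(x_padded, dim=(-1, -2)).real

# The first 6 positions no longer match after padding, because the FFT is a
# global transform over the whole (padded) sequence length.
print(torch.allclose(mix, mix_padded[:6]))  # -> False
```

So unlike attention, where padded positions can be masked out of the softmax, the FFT has no obvious per-token mask, which is what this issue is asking about.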

d5555 · Jul 30 '22