Swin-Transformer
Could I compute relative_position_bias in __init__() instead of forward() ?
Hi,
The relative_position_bias seems to be just a constant indexing of self.relative_position_bias_table, so why is it recomputed in forward() every time? Why can't the following code be moved into WindowAttention.__init__()?
relative_position_bias = self.relative_position_bias_table[self.relative_position_index.view(-1)].view(
    self.window_size[0] * self.window_size[1], self.window_size[0] * self.window_size[1], -1)  # Wh*Ww, Wh*Ww, nH
relative_position_bias = relative_position_bias.permute(2, 0, 1).contiguous()  # nH, Wh*Ww, Wh*Ww
Thank you for taking the time to clear up my confusion~
I have the same question. relative_position_bias is selected by indices from a predefined tensor, and its shape won't change during training.
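For reference, here is a minimal, self-contained sketch of the pieces being discussed (class and method names such as WindowAttentionSketch and _gather_bias are made up for illustration; this is not the official implementation). It shows that relative_position_index is already built once in __init__, while the gather from the learnable relative_position_bias_table happens per call:

```python
import torch
import torch.nn as nn


class WindowAttentionSketch(nn.Module):
    """Sketch of only the relative-position-bias machinery."""

    def __init__(self, window_size, num_heads):
        super().__init__()
        self.window_size = window_size  # (Wh, Ww)
        self.num_heads = num_heads

        # Learnable table: one row per relative offset, one column per head.
        self.relative_position_bias_table = nn.Parameter(
            torch.zeros((2 * window_size[0] - 1) * (2 * window_size[1] - 1), num_heads))

        # Constant lookup index, built once here in __init__.
        coords = torch.stack(torch.meshgrid(
            torch.arange(window_size[0]), torch.arange(window_size[1]), indexing="ij"))
        coords_flat = torch.flatten(coords, 1)                        # 2, Wh*Ww
        rel = coords_flat[:, :, None] - coords_flat[:, None, :]       # 2, Wh*Ww, Wh*Ww
        rel = rel.permute(1, 2, 0).contiguous()                       # Wh*Ww, Wh*Ww, 2
        rel[:, :, 0] += window_size[0] - 1
        rel[:, :, 1] += window_size[1] - 1
        rel[:, :, 0] *= 2 * window_size[1] - 1
        self.register_buffer("relative_position_index", rel.sum(-1))  # Wh*Ww, Wh*Ww

    def _gather_bias(self):
        # The gather the question asks about: constant index into a learnable table.
        n = self.window_size[0] * self.window_size[1]
        bias = self.relative_position_bias_table[
            self.relative_position_index.view(-1)].view(n, n, -1)     # Wh*Ww, Wh*Ww, nH
        return bias.permute(2, 0, 1).contiguous()                     # nH, Wh*Ww, Wh*Ww

    def forward(self, attn):
        # attn: (num_windows*B, nH, Wh*Ww, Wh*Ww) attention logits before softmax
        return attn + self._gather_bias().unsqueeze(0)
```

One likely reason the gather stays in forward(): relative_position_bias_table is an nn.Parameter that is updated at every optimizer step, so only the index is truly constant; a bias gathered once in __init__ would go stale during training. For inference with frozen weights, caching the gathered bias once should be safe.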