
dct_h and dct_w

Open myasser63 opened this issue 2 years ago • 5 comments

How can I set dct_h and dct_w if I want to add an FCA layer to another model? The feature maps at the layers where I want to insert the FCA layer are 160x160, 80x80, 40x40, and 20x20.

Please advise.

myasser63 avatar Aug 27 '22 07:08 myasser63

@myasser63 You can add the FCA layer directly without any modification. The feature map's size is handled automatically here: https://github.com/cfzd/FcaNet/blob/aa5fb63505575bb4e4e094613565379c3f6ada33/model/layer.py#L54-L55
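The linked lines apply adaptive average pooling when the incoming feature map is larger than (dct_h, dct_w), which is why arbitrary input sizes work. A minimal standalone sketch of that behavior (plain PyTorch, not the FcaNet layer itself; the 160x160 input and the 56x56 target are illustrative values):

```python
import torch
import torch.nn.functional as F

# Hypothetical feature map from an earlier backbone stage.
x = torch.randn(1, 64, 160, 160)

# The layer pools any larger input down to (dct_h, dct_w) before the DCT,
# so dct_h/dct_w need not match the incoming spatial size.
dct_h, dct_w = 56, 56  # illustrative values, not a required setting
pooled = F.adaptive_avg_pool2d(x, (dct_h, dct_w))
```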

cfzd avatar Aug 29 '22 02:08 cfzd

So should I leave dct_h and dct_w as they are, or set them to the feature map sizes?

self.FCA = MultiSpectralAttentionLayer(in_channels, self.dct_h, self.dct_w)

myasser63 avatar Aug 30 '22 06:08 myasser63

Either works. You can set them according to your preferences, or use the same settings as ours: https://github.com/cfzd/FcaNet/blob/aa5fb63505575bb4e4e094613565379c3f6ada33/model/fcanet.py#L19 https://github.com/cfzd/FcaNet/blob/aa5fb63505575bb4e4e094613565379c3f6ada33/model/fcanet.py#L29
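For reference, the first linked line defines a channel-count-to-DCT-size mapping along these lines (believed to match the repository's ResNet defaults; verify against the linked source before copying):

```python
# Channel count of a ResNet stage -> DCT height/width of the attention layer,
# as configured in model/fcanet.py (authors' defaults, not a requirement).
c2wh = dict([(64, 56), (128, 28), (256, 14), (512, 7)])

# The attention layer is then built per stage by lookup, e.g. for a
# hypothetical 64-channel stage:
planes = 64
dct_h, dct_w = c2wh[planes], c2wh[planes]
```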

cfzd avatar Aug 30 '22 07:08 cfzd

I tried it this way and got the following error:

self.FCA = MultiSpectralAttentionLayer(c1, c2wh[c1], c2wh[c1])

Error: RuntimeError: adaptive_avg_pool2d_backward_cuda does not have a deterministic implementation, but you set 'torch.use_deterministic_algorithms(True)'. You can turn off determinism just for this operation, or you can use the 'warn_only=True' option, if that's acceptable for your application. You can also file an issue at https://github.com/pytorch/pytorch/issues

myasser63 avatar Aug 30 '22 07:08 myasser63

@myasser63 As the error says, it's a problem with adaptive_avg_pool2d. You can either downgrade it to a warning with:

torch.use_deterministic_algorithms(True, warn_only=True)

or you can turn off determinism by:

torch.use_deterministic_algorithms(False)
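Put together, the two options look like this (standard PyTorch API; set one of them once, before training starts):

```python
import torch

# Option 1: keep the determinism check, but only warn when an op such as
# adaptive_avg_pool2d's backward has no deterministic implementation.
torch.use_deterministic_algorithms(True, warn_only=True)

# Option 2: turn the determinism check off entirely.
# torch.use_deterministic_algorithms(False)
```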

cfzd avatar Aug 30 '22 12:08 cfzd