mmselfsup

DenseCL weight initialization inconsistent with original code.

Open lorinczszabolcs opened this issue 2 years ago • 0 comments

Checklist

  1. I have searched related issues but cannot get the expected help. ✅
  2. I have read the FAQ documentation but cannot get the expected help. ✅

Hi!

I noticed that the weight initialization in DenseCL is inconsistent with the original code: when pre-trained weights are loaded, only the query encoder gets initialized and its weights are not copied to the key encoder, whereas the original repo does perform the copy, see https://github.com/WXinlong/DenseCL/blob/360df04ba25aca36d42e85c5a96552783568fccf/openselfsup/models/densecl.py#L36 and https://github.com/WXinlong/DenseCL/blob/360df04ba25aca36d42e85c5a96552783568fccf/openselfsup/models/densecl.py#L51-L58.
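
The effect can be checked with a quick sketch like the one below (just an illustration; `model` is assumed to be an already-constructed DenseCL instance with pre-trained weights loaded into the backbone):

import torch

# Sanity check: after init_weights(), every key-encoder parameter should
# equal its query-encoder counterpart. With the current implementation the
# copy never happens, so the two encoders end up with different weights.
model.init_weights()
mismatched = [
    name for (name, param_q), (_, param_k) in zip(
        model.encoder_q.named_parameters(),
        model.encoder_k.named_parameters())
    if not torch.allclose(param_q, param_k)
]
print(f'{len(mismatched)} parameters differ between encoder_q and encoder_k')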

I think DenseCL's init_weights should be overridden, for example like this:

def init_weights(self):
    """Init weights and copy query encoder init weights to key encoder."""
    super().init_weights()
    for param_q, param_k in zip(self.encoder_q.parameters(),
                                self.encoder_k.parameters()):
        param_k.data.copy_(param_q.data)
        param_k.requires_grad = False  # key encoder is not updated by gradient
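
Starting the key encoder from the query encoder's weights (and keeping requires_grad off) matches the MoCo-style design that DenseCL follows, where the key encoder is only updated by the momentum update rather than by backpropagation.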

lorinczszabolcs · Aug 08 '22 07:08