Transformer-SSL
How to load a checkpoint when using the Swin Transformer as the backbone of a Mask R-CNN model
Hi there, I've already trained my Swin Transformer with your proposed SSL method and have the checkpoints saved. I'm now trying to load that model as the backbone of a Mask R-CNN model (using your mmdetection implementation from the other repository). However, I'm getting the following error:
KeyError: 'encoder.layers.0.blocks.0.attn.relative_position_bias_table'
I guess this just requires renaming the checkpoint keys. Do you have a script that does this conversion for all layers? Thanks!
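
For reference, here is a rough sketch of the kind of conversion I mean. I'm assuming the SSL checkpoint nests the weights under a `model` key and that the mmdetection Swin backbone expects the same keys without the `encoder.` prefix; the file names and key layout here are guesses on my part, not the official conversion script.

```python
import torch

# Hypothetical paths -- replace with your own checkpoint locations.
ssl_ckpt_path = "moby_swin_tiny.pth"
out_path = "moby_swin_tiny_backbone.pth"

ckpt = torch.load(ssl_ckpt_path, map_location="cpu")
# The SSL checkpoint may nest the weights under 'model' (or 'state_dict');
# fall back to the raw dict if neither key is present.
state_dict = ckpt.get("model", ckpt.get("state_dict", ckpt))

# Keep only the online encoder weights and strip the 'encoder.' prefix
# so the keys line up with what the detection backbone expects.
prefix = "encoder."
backbone_state_dict = {
    k[len(prefix):]: v for k, v in state_dict.items() if k.startswith(prefix)
}

torch.save({"model": backbone_state_dict}, out_path)
print(f"Saved {len(backbone_state_dict)} backbone tensors to {out_path}")
```

If something like this is roughly what the official conversion does, I'm happy to adapt it, but I'd still appreciate the script you use, in case there are extra keys (e.g. the relative position bias tables) that need special handling.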