Swin UNETR pre-trained model weights have keys that are not compatible with load_from in the fine-tuning task
Describe the bug
The Swin UNETR pre-trained model weights have keys that are not compatible with load_from when fine-tuning.
To Reproduce
Steps to reproduce the behavior:
- Pre-train the Swin UNETR model.
- Call load_from with the pre-trained model weights.
- An error is raised because the model weight keys are not compatible (see the sketch below).
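Below is a minimal reproduction sketch, assuming a 96³ input and a placeholder checkpoint path; the exact constructor arguments and error message may differ from your setup.

```python
import torch
from monai.networks.nets import SwinUNETR

# Placeholder model arguments and checkpoint path; adjust to your own pre-training run.
model = SwinUNETR(img_size=(96, 96, 96), in_channels=1, out_channels=14, feature_size=48)
weights = torch.load("pretrained_swin_unetr.pt", map_location="cpu")

# Fails because the stored keys start with "swin_vit." while load_from
# looks up keys such as "module.patch_embed.proj.weight".
model.load_from(weights)
```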
These are the saved pre-trained keys; they start with swin_vit:

However, the load_from method requires the weight keys to start with module: https://github.com/Project-MONAI/MONAI/blob/edf3b742a4ae85d1f30462ed0c7511c520fae888/monai/networks/nets/swin_unetr.py#L232
We could write an extra conversion script to rename the keys into the desired format, but it would be more convenient if the pre-trained model were saved in the expected format during pre-training.
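For reference, a minimal conversion sketch along those lines, assuming the checkpoint is a dict (optionally nested under "state_dict") whose keys carry a "swin_vit." prefix; the file names are placeholders:

```python
import torch

# Placeholder file names; "swin_vit." is the prefix saved by the pre-training run,
# "module." is the prefix that SwinUNETR.load_from expects.
ckpt = torch.load("pretrained_swin_unetr.pt", map_location="cpu")
state_dict = ckpt.get("state_dict", ckpt)

converted = {}
for key, value in state_dict.items():
    if key.startswith("swin_vit."):
        # e.g. "swin_vit.patch_embed.proj.weight" -> "module.patch_embed.proj.weight"
        key = "module." + key[len("swin_vit."):]
    converted[key] = value

torch.save({"state_dict": converted}, "pretrained_swin_unetr_converted.pt")
```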
Please make sure to use the exact version of MONAI described in the dependencies. The pre-trained weights are compatible if the right MONAI version is used.
The pre-training utilizes DDP, hence module is added to the weight names. If you use a single GPU for pre-training, which is not recommended, then the weight names need to be adjusted accordingly by the user (see the sketch below).
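A minimal sketch of that adjustment, assuming the single-GPU checkpoint is a plain dict (optionally nested under "state_dict") and using placeholder file names:

```python
import torch

# DDP wraps the network as model.module, so every parameter key in a DDP checkpoint
# carries a "module." prefix; a single-GPU run saves the keys without it, so add it back.
ckpt = torch.load("single_gpu_pretrained.pt", map_location="cpu")
state_dict = ckpt.get("state_dict", ckpt)

prefixed = {f"module.{k}": v for k, v in state_dict.items()}
torch.save({"state_dict": prefixed}, "single_gpu_pretrained_ddp_style.pt")
```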