
Swin UNETR pretrained model weights have keys incompatible with `load_from` in the fine-tuning task

Open upupming opened this issue 3 years ago • 2 comments

Describe the bug

The Swin UNETR pre-trained model weights have keys that are incompatible with `load_from` in the fine-tuning task.

To Reproduce

Steps to reproduce the behavior:

  1. Pre-train the Swin UNETR model.
  2. Call `load_from` using the pretrained model weights.
  3. An error is raised because the model weight keys are incompatible.

These are the saved pretrained keys; they start with `swin_vit`:

[screenshot: state-dict keys beginning with `swin_vit.`]

But the `load_from` method requires the weight keys to start with `module`: https://github.com/Project-MONAI/MONAI/blob/edf3b742a4ae85d1f30462ed0c7511c520fae888/monai/networks/nets/swin_unetr.py#L232

We can write an extra conversion script to convert the pre-trained model to the desired format, but it would be more convenient if the pre-trained model were saved in the desired format during pretraining.
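The conversion script mentioned above could be sketched as a simple key-renaming pass over the checkpoint's state dict (a state dict is just an ordered mapping of name to tensor). The `swin_vit` and `module` prefixes come from this thread; the function name and the sample key below are illustrative, not part of MONAI's API.

```python
def convert_pretrained_keys(state_dict, src_prefix="swin_vit.", dst_prefix="module."):
    """Rename state-dict keys from one prefix to another (hypothetical helper).

    Keys that do not carry src_prefix are passed through unchanged.
    """
    converted = {}
    for key, value in state_dict.items():
        if key.startswith(src_prefix):
            converted[dst_prefix + key[len(src_prefix):]] = value
        else:
            converted[key] = value
    return converted
```

In practice one would load the checkpoint with `torch.load`, run its state dict through a function like this, and save the result with `torch.save` before calling `load_from`.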

upupming avatar Jul 02 '22 06:07 upupming

Please make sure to use the exact version of MONAI described in the dependencies. The pre-trained weights are compatible if the correct MONAI version is used.

ahatamiz avatar Jul 02 '22 07:07 ahatamiz

The pre-training utilizes DDP, hence `module` is prepended to the weight names. If you use a single GPU for pre-training, which is not recommended, then the weight names need to be adjusted accordingly by the user.
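For single-GPU checkpoints, the adjustment described here amounts to adding the `module.` prefix that DDP's wrapper would otherwise have produced. A minimal sketch, assuming the checkpoint's keys carry no prefix at all; the sample key is illustrative:

```python
def add_module_prefix(state_dict):
    # Prepend "module." to every key so the checkpoint matches the
    # naming that DDP-based pre-training would have produced.
    return {"module." + key: value for key, value in state_dict.items()}
```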

ahatamiz avatar Jul 02 '22 07:07 ahatamiz