2 comments of Metalis
I was able to get my finetuned checkpoint working by adding a 'detector.' prefix to the keys of the model["model"] state dict:

```python
import torch
from collections import OrderedDict

checkpoint = "checkpoint.pt"
wrapped_model = torch.load(checkpoint, map_location="cpu")

# Prefix every key in the wrapped state dict with 'detector.'
wrapped_model["model"] = OrderedDict(
    ("detector." + k, v) for k, v in wrapped_model["model"].items()
)

torch.save(wrapped_model, "checkpoint_fixed.pt")  # output path is just an example
```
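To see the renaming in isolation, here is a minimal sketch with a toy state dict (the key names and values below are made up; real values would be tensors from `torch.load`):

```python
from collections import OrderedDict

# Toy wrapped checkpoint standing in for what torch.load() returns;
# plain numbers suffice for demonstrating the key renaming
wrapped = {"model": OrderedDict([
    ("backbone.conv1.weight", 0),
    ("head.fc.bias", 0),
])}

# Add the 'detector.' prefix to every state-dict key, preserving order
wrapped["model"] = OrderedDict(
    ("detector." + k, v) for k, v in wrapped["model"].items()
)

print(list(wrapped["model"].keys()))
```

This prints `['detector.backbone.conv1.weight', 'detector.head.fc.bias']`, i.e. the same tensors under the names the wrapping model expects.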
@SpiderJack0516 Yes, it's for segmentation finetuning. Copying the missing keys from the official model worked for me back when I needed it:

```python
import torch
from collections import OrderedDict

# Load both checkpoints; the filenames here are examples
official = torch.load("official.pt", map_location="cpu")["model"]
finetuned = torch.load("finetuned.pt", map_location="cpu")

# Copy over only the keys the finetuned state dict is missing;
# setdefault leaves existing finetuned weights untouched
for k, v in official.items():
    finetuned["model"].setdefault(k, v)

torch.save(finetuned, "finetuned_fixed.pt")
```
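A self-contained sketch of the key-copying idea above, using toy state dicts instead of real checkpoints (key names and values are made up):

```python
from collections import OrderedDict

# The official state dict has a key ("c.extra") the finetuned one lacks
official = OrderedDict([("a.weight", 1), ("b.weight", 2), ("c.extra", 3)])
finetuned = {"model": OrderedDict([("a.weight", 10), ("b.weight", 20)])}

# Copy only the missing keys; existing finetuned weights are kept as-is
for k, v in official.items():
    finetuned["model"].setdefault(k, v)

print(dict(finetuned["model"]))
```

This prints `{'a.weight': 10, 'b.weight': 20, 'c.extra': 3}`: the finetuned weights survive, and only the missing key is filled in from the official model.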