DINO
Transfer learning issue
Hello,
I'm wondering how to fix the state_dict loading issue when trying to use a pre-trained model. Once num_classes for the custom dataset changes from 91, there are size mismatches for transformer.decoder.class_embed. Is there a better way to implement transfer learning?
Thanks, Felix
If your dataset has categories similar to COCO's, you can load the corresponding class embeddings. Overall, though, the class embedding is easy to learn: you can reuse arbitrary class embeddings from the 91 classes (e.g., the first n of the 91), or simply not load this parameter at all.
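A minimal sketch of the "don't load this parameter" option: filter the checkpoint's state_dict to drop the 91-class head before loading, so the new head keeps its random initialization. The key substring `class_embed` comes from the error above; the checkpoint path and the `"model"` key in the checkpoint dict are assumptions about your checkpoint layout.

```python
def drop_keys(state_dict, substrings=("class_embed",)):
    """Return a copy of state_dict without entries whose key contains
    any of the given substrings (here, the 91-class classification head)."""
    return {k: v for k, v in state_dict.items()
            if not any(s in k for s in substrings)}

# Usage with PyTorch (assumes a DETR/DINO-style checkpoint; path is hypothetical):
#   ckpt = torch.load("checkpoint.pth", map_location="cpu")
#   filtered = drop_keys(ckpt["model"])
#   # strict=False lets the new class_embed stay randomly initialized
#   model.load_state_dict(filtered, strict=False)
```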