Training on ImageNet to compare with Siamese-Mask-RCNN
In the paper, the proposed MTFA/iMTFA networks are compared with Siamese Mask R-CNN. However, in those experiments the backbones are pretrained on the full 1K ImageNet classes, which slightly violates few-shot conventions (the relevant classes are effectively seen during pretraining). At the same time, class lists of 687 classes (the 1000 ImageNet classes without COCO overlap) and 771 classes (the 1000 ImageNet classes without Pascal-VOC overlap) are available at bethgelab/siamese-mask-rcnn/data/.
Moreover, weights of models pretrained on the 687- and 771-class subsets have been released at bethgelab/siamese-mask-rcnn/releases.
Additional experiments training MTFA from the 687-/771-class backbones would reveal the difference between a backbone pretrained without the COCO/VOC classes and a feature extractor fully pretrained on all 1K ImageNet classes.
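
For concreteness, a minimal sketch of what such a run could look like, assuming the MTFA base training is driven through Detectron2 and that the 687-class ImageNet backbone weights have already been converted to a Detectron2-compatible checkpoint; the config and checkpoint paths below are placeholders, not files from the iMTFA repo:

```python
# Sketch only: start MTFA base training from a ResNet-50 backbone pretrained
# on the reduced 687-class ImageNet split (ImageNet without COCO overlap).
# Paths and config names are hypothetical and depend on how the reduced-ImageNet
# checkpoint was converted.
from detectron2.config import get_cfg
from detectron2.engine import DefaultTrainer

cfg = get_cfg()
cfg.merge_from_file("configs/mtfa_base_training.yaml")           # hypothetical MTFA base-training config
cfg.MODEL.WEIGHTS = "checkpoints/resnet50_imagenet_687cls.pth"   # reduced-ImageNet backbone weights
cfg.OUTPUT_DIR = "output/mtfa_base_687"

trainer = DefaultTrainer(cfg)
trainer.resume_or_load(resume=False)
trainer.train()
```

The same sketch with a 771-class checkpoint would cover the Pascal-VOC setting; everything else in the base-training and fine-tuning stages would stay unchanged, so any gap in the final few-shot numbers could be attributed to the pretraining class overlap.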