Meta-DETR

Performing inference with CPU

Open · alexanderDuenas opened this issue 9 months ago · 1 comment

Thanks for your code!

I have two questions:

  1. Is it possible to perform inference with just the CPU?
  2. Why are the support images also used during inference? In the code, the outputs are obtained with `outputs = model(img, supp_class_ids=support_class_ids_final, category_codes=all_category_codes_final)`. Are the arguments `supp_class_ids` and `category_codes` necessary?

alexanderDuenas · May 13 '24 02:05

  1. No, the codebase does not support that, but it should be fairly easy to implement yourself (see the CPU sketch after this list).

  2. Because inference still requires prototypes for all classes. During inference, `all_category_codes_final` is computed by averaging the codes of all support instances for each given class (see the second sketch below).
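
For point 1, a minimal sketch of what CPU-only inference could look like. This is not the repository's API: `run_inference_on_cpu`, the checkpoint key `'model'`, and the default checkpoint path are assumptions (DETR-style conventions), and any hard-coded `.cuda()` calls in the inference script would also need to be replaced with `.to(device)`.

```python
import torch

def run_inference_on_cpu(model, img, support_class_ids_final, all_category_codes_final,
                         checkpoint_path='checkpoint.pth'):
    """Hypothetical sketch: run a forward pass entirely on CPU.
    Argument names mirror the snippet quoted in the question."""
    device = torch.device('cpu')

    # map_location remaps tensors that were saved on GPU onto the CPU.
    checkpoint = torch.load(checkpoint_path, map_location=device)
    model.load_state_dict(checkpoint['model'])  # 'model' key is an assumption
    model.to(device)
    model.eval()

    with torch.no_grad():
        outputs = model(img.to(device),
                        supp_class_ids=support_class_ids_final.to(device),
                        category_codes=all_category_codes_final.to(device))
    return outputs
```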
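
For point 2, a minimal sketch of the per-class averaging described above. `build_category_codes` and `encode_support` are hypothetical names for illustration, not functions from this repository; the real code may operate on batched support crops rather than one image at a time.

```python
import torch

def build_category_codes(support_images_by_class, encode_support):
    """Hypothetical sketch: average support-instance codes into one
    prototype ("category code") per class.

    support_images_by_class: dict mapping class_id -> list of support images
    encode_support: callable that maps one support image to a code vector
    """
    class_ids, category_codes = [], []
    for class_id, images in support_images_by_class.items():
        # One code per support instance of this class, shape [num_support, code_dim].
        codes = torch.stack([encode_support(img) for img in images])
        # Average over support instances -> a single prototype for the class.
        category_codes.append(codes.mean(dim=0))
        class_ids.append(class_id)
    return torch.tensor(class_ids), torch.stack(category_codes)
```

In this reading, `supp_class_ids` tells the model which class each prototype belongs to, and `category_codes` carries the averaged prototypes themselves, which is why both arguments are still needed at inference time.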

ZhangGongjie · May 17 '24 02:05