
Queries on Weights for Classification Loss on Background Boxes

Open anuj-sharma-19 opened this issue 3 years ago • 7 comments

Hi,

First of all, congratulations on the great work, and many thanks for sharing the code!

I have been checking out the code, and have a query regarding the weight for classification loss on background proposals.

The classification loss on pseudo labels appears to be computed here https://github.com/microsoft/SoftTeacher/blob/bef9a256e5c920723280146fc66b82629b3ee9d4/ssod/models/soft_teacher.py#L240

where the bbox_targets are computed a couple of lines earlier here https://github.com/microsoft/SoftTeacher/blob/bef9a256e5c920723280146fc66b82629b3ee9d4/ssod/models/soft_teacher.py#L215

which I understand refers to the bbox_head.py in mmdetection here https://github.com/open-mmlab/mmdetection/blob/bde7b4b7eea9dd6ee91a486c6996b2d68662366d/mmdet/models/roi_heads/bbox_heads/bbox_head.py#L183

which further calls _get_target_single() here https://github.com/open-mmlab/mmdetection/blob/bde7b4b7eea9dd6ee91a486c6996b2d68662366d/mmdet/models/roi_heads/bbox_heads/bbox_head.py#L117

But here the negative proposals are assigned a weight of 1.0, whereas, as described in the paper, the weight should be the teacher's cls_score for those proposals.
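For reference, the default behavior being questioned can be sketched in pure Python (all names and shapes here are illustrative, not the actual mmdetection signatures):

```python
# Minimal sketch of the default label weighting in mmdetection's
# `_get_target_single`: positive proposals use `pos_weight` (with
# values <= 0 meaning "use 1.0"), and negative (background) proposals
# always default to a weight of 1.0.
def get_label_weights(num_pos, num_neg, pos_weight=-1):
    pw = 1.0 if pos_weight <= 0 else pos_weight
    # negatives get 1.0 by default, which is what the question is about
    return [pw] * num_pos + [1.0] * num_neg

print(get_label_weights(num_pos=2, num_neg=3))
# [1.0, 1.0, 1.0, 1.0, 1.0]
```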

Maybe I am missing something in the code. It would be really great if you could clarify the above query or point me to where the teacher's cls-score is incorporated into the classification loss for background proposals.

Thank you!!

Best Regards, Anuj

anuj-sharma-19 · Jan 07 '22

Okay, I see now where the weight is updated to the background cls-score from the teacher: https://github.com/microsoft/SoftTeacher/blob/bef9a256e5c920723280146fc66b82629b3ee9d4/ssod/models/soft_teacher.py#L235
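The idea at that line can be sketched as follows (a simplified pure-Python version; `labels`, `label_weights`, and `bg_score` are illustrative names, not the actual variables in soft_teacher.py):

```python
# Hedged sketch of the SoftTeacher reweighting: after the default
# targets are built, the weight of each background proposal is
# replaced by the teacher's predicted background-class score.
def reweight_background(labels, label_weights, bg_score, num_classes):
    out = list(label_weights)
    for j, lbl in enumerate(labels):
        # mmdetection convention: background label == num_classes
        if lbl == num_classes:
            out[j] = bg_score[j]
    return out

labels = [0, 3, 80, 80]          # last two are background (num_classes = 80)
weights = [1.0, 1.0, 1.0, 1.0]   # defaults from _get_target_single
bg_score = [0.0, 0.0, 0.9, 0.2]  # teacher's background probability per proposal
print(reweight_background(labels, weights, bg_score, 80))
# [1.0, 1.0, 0.9, 0.2]
```

Foreground proposals keep their weight of 1.0; only the background weights become "soft".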

anuj-sharma-19 · Jan 07 '22

In the final classification loss here https://github.com/microsoft/SoftTeacher/blob/bef9a256e5c920723280146fc66b82629b3ee9d4/ssod/models/soft_teacher.py#L243

it seems the loss is:

# where `w` is the cls-score from teacher for background proposal
total_cls_loss = sum(cls_loss_on_all_fg_proposals) + sum(w * cls_loss_on_bg_proposals)
avg_factor = N_fg + sum(w over bg proposals)
loss_cls = total_cls_loss / avg_factor

So avg_factor is not N_fg as in the paper. Also, the w_j as defined in the paper (Eq. 5) does not seem to be used in the code.
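The normalization described above can be checked with a small numeric sketch (values are made up for illustration; function and argument names are mine, not from the repo):

```python
# Numeric sketch of the loss as implemented: the teacher's background
# scores `w` weight the background losses AND enter the normalizer,
# so avg_factor = N_fg + sum(w) rather than just N_fg.
def soft_cls_loss(fg_losses, bg_losses, w):
    total = sum(fg_losses) + sum(wi * li for wi, li in zip(w, bg_losses))
    avg_factor = len(fg_losses) + sum(w)
    return total / avg_factor

loss = soft_cls_loss(fg_losses=[0.5, 0.3], bg_losses=[1.0, 2.0], w=[0.9, 0.2])
# total = 0.8 + (0.9 + 0.4) = 2.1; avg_factor = 2 + 1.1 = 3.1
print(round(loss, 4))  # 0.6774
```

Note that if all background weights were 1.0, this would reduce to the ordinary mean over all sampled proposals.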

It would be really great if you could kindly clarify these two doubts.

Thank you!!

Anuj

anuj-sharma-19 · Jan 07 '22

Yes, you are right. This typo should have been fixed in the paper, but I am not sure why it wasn't. I will discuss it with my collaborators.

MendelXu · Jan 07 '22

Great. Thanks a lot for the quick clarifications!! :+1:

anuj-sharma-19 · Jan 07 '22

Very good and important question. I'm curious about the relationship between the weight setting and the final mAP performance; this could also benefit my work. I'm commenting here to follow the latest updates.

jackhu-bme · Jan 09 '22

@anuj-sharma-19 @Jack-Hu-2001 The correct equation, as I understand it (reconstructed here from the code discussion above, since the original attached image is not available), is:

loss_cls = ( Σ_i l_cls(b_i^fg) + Σ_j w_j · l_cls(b_j^bg) ) / ( N_fg + Σ_j w_j )

@MendelXu please check it.

lliuz · Jan 14 '22

@lliuz Hi, yes, the above equation matches the code correctly.

anuj-sharma-19 · Jan 18 '22