CACNet-Pytorch
I found the reason why the composition acc is lower when training end2end
Hi @bo-zhang-cs,
When testing your released model, I found that the accuracy is lower than in the paper (around 73%), and I tried to figure out how to fix this. After multiple experiments, I have found the reason why the composition accuracy is lower when training end-to-end: the gradient flowing from the cropping branch back to the backbone hurts the training of the composition branch. The solution is to use a detach operation to stop the gradient of the cropping branch loss:
f4_detach = f4.detach()
offsets = self.cropping_module(f4_detach)
The result I get is: epoch 38, FCDB_iou: 0.7032, FCDB_disp: 0.0728, FLMS_iou: 0.8441, FLMS_disp: 0.0360, Acc: 88.54%.
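The detach trick can be illustrated on a toy two-head model (the module and variable names below are illustrative, not the repo's actual architecture): the composition branch trains the shared backbone as usual, while the cropping branch receives detached features, so its loss cannot push gradients into the backbone.

```python
import torch
import torch.nn as nn

# Minimal sketch: a shared backbone feeds two heads. Detaching the
# backbone features before the cropping head blocks that head's loss
# gradient from reaching the backbone (assumed shapes for illustration).
backbone = nn.Linear(4, 4)
composition_head = nn.Linear(4, 2)
cropping_head = nn.Linear(4, 4)

x = torch.randn(8, 4)
f4 = backbone(x)

comp_logits = composition_head(f4)       # gradient flows to the backbone
offsets = cropping_head(f4.detach())     # gradient stops at detach()

loss = comp_logits.sum() + offsets.sum()
loss.backward()

# backbone.weight.grad now comes only from the composition branch.
```

If instead the cropping head were given `f4` directly, both losses would update the backbone, which is the interference described above.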
Hi @dongdk: Wow, what a fantastic discovery! 🎉 Detaching the gradient from the cropping branch is a brilliant, insightful fix. Thanks for sharing your findings and solution; this will definitely help move things forward! 🚀