
I found the reason why the composition acc is lower when training end-to-end

Open · dongdk opened this issue 1 year ago · 1 comment

Hi @bo-zhang-cs, when testing your released model, I found that the composition accuracy is lower than the paper's, at 73%, and I tried to figure out how to fix this. After running multiple experiments, I found the reason why the composition accuracy is lower when training end-to-end: the gradient flowing from the cropping branch back into the backbone is harmful to training the composition branch. The solution is to use a detach operation to stop the gradient of the cropping-branch loss:

            # stop the cropping-branch gradient from reaching the backbone
            f4_detach = f4.detach()
            offsets = self.cropping_module(f4_detach)

The result I get: epoch 38, FCDB_iou: 0.7032, FCDB_disp: 0.0728, FLMS_iou: 0.8441, FLMS_disp: 0.0360, Acc: 88.54%.
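For anyone wanting to see the effect of the detach in isolation, here is a minimal sketch of the idea with a toy two-branch model. The class name `TwoBranchNet`, the layer sizes, and the head names are hypothetical stand-ins, not CACNet's actual modules; the point is only that `detach()` blocks the cropping-branch gradient from reaching the shared backbone while the composition branch still trains it:

```python
import torch
import torch.nn as nn

class TwoBranchNet(nn.Module):
    """Toy stand-in for a shared backbone with two heads (hypothetical, not CACNet)."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Linear(8, 8)          # plays the role of the f4 feature producer
        self.composition_head = nn.Linear(8, 4)  # composition classification branch
        self.cropping_head = nn.Linear(8, 4)     # cropping regression branch

    def forward(self, x):
        f4 = self.backbone(x)
        comp_logits = self.composition_head(f4)
        # detach: the cropping loss will no longer back-propagate into the backbone
        offsets = self.cropping_head(f4.detach())
        return comp_logits, offsets

net = TwoBranchNet()
x = torch.randn(2, 8)
comp_logits, offsets = net(x)

# Back-propagate ONLY the cropping-branch loss.
offsets.sum().backward()
print(net.backbone.weight.grad is None)           # True: detach blocked the gradient
print(net.cropping_head.weight.grad is not None)  # True: the head itself still gets gradients
```

In end-to-end training both losses are summed as usual; the detach simply routes the backbone updates exclusively through the composition loss.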

dongdk avatar Oct 14 '24 11:10 dongdk

Hi @dongdk: Wow, what a fantastic discovery! 🎉 Your solution to detach the gradient from the cropping branch is brilliant—very insightful! Thanks for sharing your findings and solution—this will definitely help move things forward! 🚀

bo-zhang-cs avatar Oct 14 '24 14:10 bo-zhang-cs