BDCN
why stop gradients on upsample
o1, o2, o3, o4, o5 = s1.detach(), s2.detach(), s3.detach(), s4.detach(), s5.detach()
o11, o21, o31, o41, o51 = s11.detach(), s21.detach(), s31.detach(), s41.detach(), s51.detach()
Why do you use detach() here? Do you mean that VGG's parameters are frozen?
No, we don't stop the gradient of the upsample. We only stop the gradient flowing to the previous side predictions, which we assume are already the proper edges that those layers should predict.
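A toy PyTorch sketch (not the actual BDCN code; the scalar "layers" w1, w2 are made up for illustration) of what detach() does here: the detached side output is used as a value in the fused prediction, but the backward pass sends no gradient to the layer that produced it.

```python
import torch

# Two "layers" each produce a side prediction; the fusion uses a
# detached copy of the earlier one, so the fusion loss updates w2
# but sends no gradient back to w1.
w1 = torch.tensor(1.0, requires_grad=True)
w2 = torch.tensor(1.0, requires_grad=True)
x = torch.tensor(2.0)

s1 = w1 * x               # earlier side prediction
s2 = w2 * x               # current side prediction
fused = s1.detach() + s2  # detach: treat s1 as a fixed value

fused.backward()
print(w1.grad)  # None -> no gradient reached the earlier layer
print(w2.grad)  # tensor(2.) -> gradient flows normally to w2
```

If s1 were not detached, w1 would also receive a gradient from the fusion loss, so every layer would keep being pushed by the later predictions instead of being treated as a fixed supervision target.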
@pkuCactus Is it necessary to detach()? What will happen if we do not detach()?