
Question about calling backward() on the loss

Open · menghuanlater opened this issue 4 years ago · 1 comment

Hi, author. Why does the code call loss.backward() to compute gradients instead of reduce_loss.backward()?

menghuanlater · Dec 21 '20
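For context, the pattern being asked about usually looks like the sketch below (assuming a standard DistributedDataParallel setup; the `reduce_mean` helper and variable names here are illustrative, not necessarily the repo's exact code). In DDP, each rank backpropagates its own local loss, and DDP's hooks all-reduce the gradients during backward, so `loss.backward()` already produces synchronized gradients; the reduced loss is typically computed only for logging.

```python
# Minimal sketch of a typical DDP training step (assumed setup; names are illustrative).
import torch
import torch.distributed as dist

def reduce_mean(tensor, world_size):
    # All-reduce a detached copy of the tensor across ranks and average it.
    # Intended only for logging a globally averaged loss value.
    rt = tensor.clone().detach()
    dist.all_reduce(rt, op=dist.ReduceOp.SUM)
    rt /= world_size
    return rt

# Inside the training loop (model wrapped in DistributedDataParallel):
#   loss = criterion(model(inputs), targets)       # local loss on this rank
#   reduce_loss = reduce_mean(loss, world_size)    # averaged loss, for logging only
#   optimizer.zero_grad()
#   loss.backward()   # DDP all-reduces the *gradients* during backward,
#                     # so backpropagating the local loss is sufficient;
#                     # reduce_loss is detached and has no autograd graph.
#   optimizer.step()
```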

Same question here.

MachineVision123 · Feb 28 '21