
6.5: exhausted-iterator bug when calling gradient clipping `grad_clipping` in the concise RNN implementation

Open alqbib opened this issue 4 years ago • 0 comments

In section 6.5 (the concise implementation of recurrent neural networks), gradient clipping is invoked as `d2l.grad_clipping(model.parameters(), clipping_theta, device)`. The argument `model.parameters()` is a generator, so the first `for` loop inside `grad_clipping` exhausts it; the second `for` loop then iterates over an already-empty iterator and never executes the scaling assignment:

```python
def grad_clipping(params, theta, device):
    norm = torch.tensor([0.0], device=device)
    for param in params:  # first loop: exhausts the params iterator
        norm += (param.grad.data ** 2).sum()
    norm = norm.sqrt().item()
    if norm > theta:
        for param in params:  # iterator already exhausted, so this body never runs
            param.grad.data *= (theta / norm)
```
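The root cause is ordinary Python iterator exhaustion and can be shown without PyTorch. The helper `consume_twice` below is hypothetical, written only to illustrate the behavior (it is not part of `d2l`):

```python
def consume_twice(params):
    """Count elements seen by two successive for-loops over the same object."""
    first = sum(1 for _ in params)   # first pass over params
    second = sum(1 for _ in params)  # second pass: 0 if params was a one-shot iterator
    return first, second

gen = (x for x in range(3))
print(consume_twice(gen))             # generator: second pass sees nothing -> (3, 0)
print(consume_twice(list(range(3))))  # list: both passes see all items -> (3, 3)
```

This is exactly what happens inside `grad_clipping`: the norm computation plays the role of the first pass, and the scaling loop is the second pass that silently does nothing.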

The call should be changed to `d2l.grad_clipping(list(model.parameters()), clipping_theta, device)`, so that the function receives a list that can be iterated more than once.
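An alternative fix, assuming one is free to edit `grad_clipping` itself, is to materialize the iterator once inside the function so every call site is safe. The sketch below uses a hypothetical `FakeParam` class in place of real PyTorch parameters and plain floats instead of tensors, purely to keep the example self-contained:

```python
import math

class FakeParam:
    """Hypothetical stand-in for a model parameter; grad_values mimics param.grad.data."""
    def __init__(self, grad_values):
        self.grad_values = list(grad_values)

def grad_clipping(params, theta):
    # Materialize the (possibly one-shot) iterator once, so both passes see every parameter.
    params = list(params)
    norm = math.sqrt(sum(g ** 2 for p in params for g in p.grad_values))
    if norm > theta:
        for p in params:  # second pass now works even if a generator was passed in
            p.grad_values = [g * (theta / norm) for g in p.grad_values]
    return norm

p = FakeParam([3.0, 4.0])
grad_clipping(iter([p]), 1.0)  # norm is 5.0 > theta, so grads are scaled by 1/5
print(p.grad_values)           # [0.6, 0.8]
```

Either fix works; converting inside the function is arguably more robust, since callers no longer need to remember to wrap `model.parameters()` in `list(...)`.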

alqbib avatar Mar 03 '20 05:03 alqbib