torchbearer
GradientNormClipping callback error
When I add this callback to the Trial I get the following error. Is this some kind of bug? It seems like the gradients are not being passed to the callback.
""" File "/home/dimitris/.local/lib/python3.6/site-packages/torch/nn/utils/clip_grad.py", line 30, in clip_grad_norm_ total_norm = torch.norm(torch.stack([torch.norm(p.grad.detach(), norm_type) for p in parameters]), norm_type) RuntimeError: stack expects a non-empty TensorList """