GradNorm

NoneType

Open xinjiTian opened this issue 1 year ago • 0 comments

```
File "main_simmim_pt.py", line 302, in train_one_epoch
    G1R = torch.autograd.grad(L1, param[0].clone(), retain_graph=True, create_graph=True)
File "D:\txj\envs\swin2\lib\site-packages\torch\autograd\__init__.py", line 236, in grad
    inputs, allow_unused, accumulate_grad=False)
RuntimeError: One of the differentiated Tensors appears to not have been used in the graph. Set allow_unused=True if this is the desired behavior.
```
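For context, this error is expected whenever `torch.autograd.grad` is asked to differentiate with respect to a tensor that was never part of the loss's computation graph. `param[0].clone()` creates a brand-new tensor after the loss was computed, so it cannot appear in the graph. A minimal sketch reproducing this (the names `w` and `L1` are hypothetical, not taken from `main_simmim_pt.py`):

```python
import torch

# Hypothetical shared parameter and a scalar loss built from it.
w = torch.randn(3, requires_grad=True)
L1 = (w ** 2).sum()

# w.clone() is a NEW tensor created after L1, so it is not in L1's
# autograd graph -> grad() raises the RuntimeError from the traceback.
try:
    torch.autograd.grad(L1, w.clone(), retain_graph=True, create_graph=True)
    raised = False
except RuntimeError:
    raised = True

# Differentiating with respect to the original parameter works;
# create_graph=True keeps the gradient differentiable, as GradNorm needs.
(g,) = torch.autograd.grad(L1, w, retain_graph=True, create_graph=True)
```

In other words, passing `param[0]` itself (not a clone) is the likely fix; `allow_unused=True` would only suppress the error and return `None` for the unused input, which is probably where the "NoneType" in the title comes from.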

xinjiTian · Oct 06 '23