
RuntimeError: Backward is not reentrant

Open · zhengxinvip opened this issue on Jul 31, 2019 · 1 comment

```
(pytorch) wuwenfu@wuwenfu:~/DCNv2-master$ python test.py
torch.Size([2, 64, 128, 128])
torch.Size([20, 32, 7, 7])
torch.Size([20, 32, 7, 7])
torch.Size([20, 32, 7, 7])
0.971507, 1.943014
0.971507, 1.943014
Zero offset passed
/home/wuwenfu/.conda/envs/pytorch/lib/python3.7/site-packages/torch/autograd/gradcheck.py:239: UserWarning: At least one of the inputs that requires gradient is not of double precision floating point. This check will likely fail if all the inputs are not of double precision floating point.
  'At least one of the inputs that requires gradient '
check_gradient_dpooling: True
Traceback (most recent call last):
  File "test.py", line 265, in <module>
    check_gradient_dconv()
  File "test.py", line 97, in check_gradient_dconv
    eps=1e-3, atol=1e-4, rtol=1e-2))
  File "/home/wuwenfu/.conda/envs/pytorch/lib/python3.7/site-packages/torch/autograd/gradcheck.py", line 289, in gradcheck
    return fail_test('Backward is not reentrant, i.e., running backward with same '
  File "/home/wuwenfu/.conda/envs/pytorch/lib/python3.7/site-packages/torch/autograd/gradcheck.py", line 224, in fail_test
    raise RuntimeError(msg)
RuntimeError: Backward is not reentrant, i.e., running backward with same input and grad_output multiple times gives different values, although analytical gradient matches numerical gradient
```

How can I fix it? Thanks.
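For reference, here is a minimal, self-contained sketch (not from test.py; all names are generic and nothing in it is DCNv2-specific) of how `torch.autograd.gradcheck` is normally driven. Two points are relevant to the log above: gradcheck expects double-precision inputs, which is what the UserWarning is about, and its reentrancy check reruns backward with identical inputs and requires identical results.

```python
import torch
from torch.autograd import gradcheck

# gradcheck compares analytical gradients against finite-difference
# estimates, then runs backward again with the same input/grad_output and
# requires identical results (the "reentrant" check failing in the
# traceback above). Inputs must be double precision, otherwise the
# finite-difference comparison itself becomes unreliable.
x = torch.randn(4, 3, dtype=torch.double, requires_grad=True)
w = torch.randn(3, 5, dtype=torch.double, requires_grad=True)

ok = gradcheck(lambda a, b: (a @ b).sigmoid(), (x, w), eps=1e-6, atol=1e-4)
print(ok)  # True: a deterministic op passes both checks
```

With a custom CUDA op in place of the lambda, the reentrancy check can fail even when the gradients are correct: if the backward kernel accumulates with floating-point atomics (as deformable-convolution backward kernels commonly do), the summation order can differ between runs, so "Backward is not reentrant" does not necessarily mean the gradients are wrong. The error message itself notes that the analytical gradient matches the numerical one.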

zhengxinvip · Jul 31 '19

Same error here. Does anyone know how to fix it?

mumianyuxin · Mar 25 '20