
Grad of all-zero flow

Open · askerlee opened this issue 3 years ago · 2 comments

In the simplest case, if the flow is an all-zero tensor, every source element is copied to the same position in the target tensor. If the flow changes slightly, the target tensor also changes, which means the gradient with respect to the flow should be non-zero. However, this test code shows otherwise:

from Forward_Warp import forward_warp
import torch

a = torch.randn(1, 1, 5, 5)                          # source image (B, C, H, W)
flow = torch.zeros(1, 5, 5, 2, requires_grad=True)   # all-zero forward flow (B, H, W, 2)
fwarp = forward_warp()
b = fwarp(a, flow)
b.sum().backward()
print(flow.grad)
# flow.grad is an all-zero tensor.

Any idea what the issue is? Thanks.

askerlee · May 03 '22 14:05

I also tried to use random flow, and flow.grad is still all-zero. This seems quite weird.

askerlee · May 03 '22 15:05
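
One possible explanation, offered as a hypothesis rather than a confirmed diagnosis of the Forward_Warp package: if the warp rounds each target coordinate to the nearest pixel (nearest-neighbour splatting), the output is piecewise constant in the flow, so its gradient with respect to the flow is zero almost everywhere. A bilinear-splatting warp, by contrast, spreads each source pixel over its four neighbouring target pixels with weights that are differentiable in the flow. The sketch below is a minimal pure-PyTorch bilinear-splatting forward warp, independent of the Forward_Warp package, that reruns the same test and produces a non-zero flow.grad.

import torch

def forward_warp_bilinear(src, flow):
    # src:  (B, C, H, W) source image
    # flow: (B, H, W, 2) forward flow, (dx, dy) per source pixel
    B, C, H, W = src.shape
    ys, xs = torch.meshgrid(
        torch.arange(H, dtype=src.dtype, device=src.device),
        torch.arange(W, dtype=src.dtype, device=src.device),
        indexing="ij",
    )
    # Target coordinates of every source pixel.
    x = xs.unsqueeze(0) + flow[..., 0]    # (B, H, W)
    y = ys.unsqueeze(0) + flow[..., 1]    # (B, H, W)
    x0, y0 = torch.floor(x), torch.floor(y)

    out = src.new_zeros(B, C, H * W)
    # Splat each source pixel into its four neighbouring target pixels.
    # The bilinear weights depend on the flow, so autograd propagates
    # gradients back to it.
    for ox, oy in ((0, 0), (1, 0), (0, 1), (1, 1)):
        xi, yi = x0 + ox, y0 + oy
        w = (1 - (x - xi).abs()) * (1 - (y - yi).abs())   # bilinear weight
        inside = (xi >= 0) & (xi < W) & (yi >= 0) & (yi < H)
        w = w * inside.to(w.dtype)                        # drop out-of-frame splats
        idx = (yi.clamp(0, H - 1) * W + xi.clamp(0, W - 1)).long()
        idx = idx.reshape(B, 1, H * W).expand(B, C, H * W)
        contrib = (src * w.unsqueeze(1)).reshape(B, C, H * W)
        out = out.scatter_add(2, idx, contrib)
    return out.reshape(B, C, H, W)

a = torch.randn(1, 1, 5, 5)
flow = torch.zeros(1, 5, 5, 2, requires_grad=True)
b = forward_warp_bilinear(a, flow)
b.sum().backward()
print(flow.grad)   # non-zero in general (some border entries remain zero)

Two caveats: torch.meshgrid(..., indexing="ij") needs a reasonably recent PyTorch, and b.sum() is a fairly weak probe, since bilinear splatting preserves the total contribution of pixels whose four splats all land in frame, so much of the flow gradient cancels. A loss such as (b ** 2).sum() depends more directly on where each pixel lands and is a more discriminating check.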

Phishing attack. I've reported it to GitHub.

askerlee · Feb 26 '24 05:02