Forward-Warp
Grad of all-zero flow
In the simplest case, if the flow is a zero tensor, every source element is copied to the same position in the target tensor. If the flow changes slightly, the target tensor also changes, which means the flow gradient should be non-zero. However, this test code shows otherwise:
from Forward_Warp import forward_warp
import torch
a = torch.randn(1,1,5,5)
flow = torch.zeros(1,5,5,2, requires_grad=True)
fwarp = forward_warp()
b = fwarp(a, flow)
b.sum().backward()
print(flow.grad)
# flow.grad is an all-zero tensor
Any idea what the issue is? Thanks.
I also tried a random flow, and flow.grad is still all zeros. This seems quite weird.
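For reference, here is a minimal pure-PyTorch sketch of forward warping via bilinear splatting; the function name splat_forward_warp and the (dx, dy) channel ordering of the flow are assumptions for illustration, not the package's actual implementation. It makes one property easy to check: for in-bounds pixels the four bilinear weights sum to one, so b.sum() can be nearly insensitive to the flow, whereas a loss such as (b ** 2).sum() does produce a non-zero flow.grad here.

import torch

def splat_forward_warp(src, flow):
    # src: (B, C, H, W); flow: (B, H, W, 2), assumed (dx, dy) ordering.
    B, C, H, W = src.shape
    ys, xs = torch.meshgrid(
        torch.arange(H, dtype=src.dtype, device=src.device),
        torch.arange(W, dtype=src.dtype, device=src.device),
        indexing="ij",
    )
    x = xs + flow[..., 0]  # target x coordinate of each source pixel, (B, H, W)
    y = ys + flow[..., 1]  # target y coordinate of each source pixel, (B, H, W)
    x0, y0 = x.floor(), y.floor()
    out = src.new_zeros(B, C, H * W)
    for dx in (0, 1):
        for dy in (0, 1):
            xi, yi = x0 + dx, y0 + dy
            # Bilinear weight of this corner; differentiable w.r.t. the flow.
            w = (1 - (x - xi).abs()) * (1 - (y - yi).abs())
            w = w * ((xi >= 0) & (xi < W) & (yi >= 0) & (yi < H))
            idx = (yi.clamp(0, H - 1) * W + xi.clamp(0, W - 1)).long()
            idx = idx.view(B, 1, H * W).expand(B, C, H * W)
            vals = (src * w.unsqueeze(1)).view(B, C, H * W)
            out = out.scatter_add(2, idx, vals)  # splat onto the target grid
    return out.view(B, C, H, W)

a = torch.randn(1, 1, 5, 5)
flow = torch.zeros(1, 5, 5, 2, requires_grad=True)
b = splat_forward_warp(a, flow)
(b ** 2).sum().backward()
print(flow.grad)  # generally non-zero with this loss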
Phishing attack. I've reported it to GitHub.