
Incorrect gradient for assignment

yongfei25 opened this issue on Apr 03 '16 · 2 comments

Hi,

If you add a third assignment here, for example xc[3] = 1, the gradient check reports an incorrect gradient. I'm not sure whether this is an issue with the assignment itself or with the gradient check.

https://github.com/twitter/torch-autograd/blob/86a5963dd6e3cfd9c3b29fcddcf2edb7c3759ac4/test/test.lua#L1471-L1472
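Roughly, the modification looks like this (a sketch only, not the exact code from the linked test lines; the existing assignment values are illustrative):

local autograd = require 'autograd'

-- Sketch of the tested function with a third assignment added
local f = function(params)
   local xc = torch.clone(params.x)
   xc[1] = 3
   xc[2] = 5
   xc[3] = 1   -- the added assignment that makes the gradient check fail
   return torch.sum(xc)
end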

Can you help me with this? Thanks :)

Update: it seems to happen only in optimized mode.

yongfei25 · Apr 03 '16 08:04

FYI, assignment doesn't work at all in non-optimized mode. This is pretty weird; @luketwitter, any ideas?
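For reference, a minimal sketch of trying the non-optimized path (illustrative only; given the behaviour described above, this presumably errors out or returns a wrong gradient):

local autograd = require 'autograd'
autograd.optimize(false)   -- use the direct (non-optimized) evaluator

local f = function(params)
   local xc = torch.clone(params.x)
   xc[1] = 2   -- in-place assignment into the cloned tensor
   return torch.sum(xc)
end

local df = autograd(f)
print(df({x = torch.randn(10)}))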

alexbw · Apr 29 '16 15:04

Hey, I'd really like to get this fixed. Here's a bit of debugging I've tried.

local autograd = require 'autograd'
autograd.optimize(true)

local f2 = function(params)
   local xc = torch.clone(params.x)
   xc[1] = 2
   xc[2] = 3
   xc[3] = 4
   return torch.sum(xc)
end

local df = autograd(f2)
df({x=torch.randn(10),y=torch.randn(3)})

-- This generates the following code (excerpt):
-- ...
rlocals[1][1] = 2
rlocals[1][2] = 3
locals[3] = torch_sum(rlocals[1])
rlocals[1][3] = 4
-- ...

It seems to me that the assignment rlocals[1][3] = 4 is emitted after torch_sum, so it is not taken into account in the sum. I've also looked into the internals of the code compilation; do you have any idea which area I should start looking at?
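If it helps, here is a rough numerical sanity check (a sketch; it assumes df returns the gradient table as its first result). Since the first three elements of the clone are overwritten, d(sum)/dx should be zero at indices 1..3 and one elsewhere:

local x = torch.randn(10)
local grads = df({x = x, y = torch.randn(3)})
-- Expected grads.x: zeros at indices 1..3 (overwritten), ones elsewhere
print(grads.x)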

yongfei25 · Jul 07 '16 13:07