torch-autograd
torch.log() does not accept a number as an argument
Hi, when trying to do util.logSoftMax(Z), I got:
/home/ubuntu/lib/torch/install/bin/luajit: invalid arguments: number
expected arguments: [*CudaTensor*] CudaTensor
stack traceback:
[C]: at 0x7f24a87612e0
[C]: in function 'fn'
...all/share/lua/5.1/autograd/runtime/direct/DirectTape.lua:51: in function 'log'
...ubuntu/lib/torch/install/share/lua/5.1/autograd/util.lua:77: in function 'logSumExp'
...ubuntu/lib/torch/install/share/lua/5.1/autograd/util.lua:81: in function
(It worked with torch.setdefaulttensortype('torch.FloatTensor'), but failed with torch.setdefaulttensortype('torch.CudaTensor').)
Can you provide a small snippet of runnable code to reproduce?
Hey @alexbw, try running this code once with CudaTensor and once with FloatTensor as the default tensor type:
local autograd = require 'autograd'
require 'cutorch'
torch.setdefaulttensortype('torch.CudaTensor')

local f = function(p)
   -- torch.sum() with no dimension returns a Lua number, not a tensor
   local x = torch.sum(torch.zeros(1, 30))
   print(x)
   local y = torch.log(x) -- error: torch.log() is handed a number
   return y
end
local df = autograd(f, { inline = false })
df(torch.zeros(1, 30)) -- input for f; unused inside f, but df needs an argument
(Updated) I've updated the code above. A few observations:
- It seems that torch.sum() returns a number when no dimension is specified, which I think is correct behavior.
- The error occurs when that number is passed into torch.log(), which only accepts a Tensor as its argument.
- It works with FloatTensor, but not with CudaTensor (a possible workaround is sketched below).
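A minimal sketch of one possible workaround, not a fix for the underlying issue: passing an explicit dimension to torch.sum() keeps the intermediate result as a 1x1 tensor, so torch.log() never receives a plain Lua number. The input shape and the final reduction here are only illustrative, and this assumes the CudaTensor default type otherwise behaves as in the snippet above.

local autograd = require 'autograd'
require 'cutorch'
torch.setdefaulttensortype('torch.CudaTensor')

local f = function(p)
   local x = torch.sum(p, 2)   -- sum over dim 2: result stays a 1x1 tensor, not a number
   local y = torch.log(x)      -- torch.log() now receives a CudaTensor
   return torch.sum(y)         -- reduce to a scalar for autograd
end
local df = autograd(f, { inline = false })
local grads, loss = df(torch.ones(1, 30))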