pytorch-skipthoughts
RuntimeError when training
Hey there! I've preprocessed the data as you describe (the head of my data file looks like this):
iron cement is a ready for use paste which is laid as a fillet by putty knife or finger in the mould edges ( corners ) of the steel ingot mould .
iron cement protects the ingot against the hot , abrasive steel casting process .
iron cement is freshly applied after each steel pour in a coating thickness of approx . ~ 2 @-@ 3 mm .
a fire restant repair cement for fire places , ovens , open fireplaces etc .
Translator Internet is a Toolbar for MS Internet Explorer .
However, during training I get the following traceback:
File "/home/user/.virtualenvs/pytorch-env/lib/python3.5/site-packages/pytorch_skipthoughts-0.4.4-py3.5.egg/torchst/train.py", line 357, in step
File "/home/user/.virtualenvs/pytorch-env/lib/python3.5/site-packages/pytorch_skipthoughts-0.4.4-py3.5.egg/torchst/train.py", line 218, in merge_batches
File "/home/user/.virtualenvs/pytorch-env/lib/python3.5/site-packages/pytorch_skipthoughts-0.4.4-py3.5.egg/torchst/train.py", line 218, in <listcomp>
RuntimeError: cat(): argument 'tensors' (position 1) must be tuple of Variables, not Variable
Python 3.5, PyTorch 0.3.
It seems to fail on the line x, x_lens, ys_i, ys_t, ys_lens, xys_idx = [torch.cat(d) for d in data], but installing the fixed version on top of the original didn't help either. Thanks!
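For reference, I believe the following minimal snippet triggers the same error on PyTorch 0.3 (the tensor contents are made up; only the shape of the call matters):

import torch
from torch.autograd import Variable

v = Variable(torch.randn(2, 3))

# cat() expects a tuple/list of Variables; passing a single Variable
# raises the same RuntimeError as in the traceback above.
torch.cat(v)

# This form works: it concatenates along dim 0, giving shape (4, 3).
torch.cat([v, v])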
Hi, I'm running into the same error as @mojesty above:
RuntimeError: cat(): argument 'tensors' (position 1) must be tuple of Variables, not Variable
Any insight is appreciated. Thanks!
I won't claim this is right or that it won't create other errors, but the following made the error go away:
x, x_lens, ys_i, ys_t, ys_lens, xys_idx = [torch.cat([Variable(e.data) for e in d]) for d in data]
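To illustrate what the change does (d below is a hypothetical stand-in for one element of data; I don't know the real shapes): iterating a Variable yields per-row Variables, and collecting them into an explicit list gives cat() the tuple it expects.

import torch
from torch.autograd import Variable

# Hypothetical stand-in for one element of `data`.
d = Variable(torch.randn(4, 3))

# torch.cat(d) fails as above; an explicit list of Variables works.
rows = [Variable(e.data) for e in d]   # four Variables of shape (3,)
merged = torch.cat(rows)               # 1-D result of shape (12,)

# Caveat: wrapping e.data in a fresh Variable detaches each piece from
# the autograd graph, so gradients will not flow back through `merged`.

That detaching is one reason I can't promise this won't cause other problems downstream.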
@markriedl After applying this fix, I run into another error:
RuntimeError: Expected object of type Variable[torch.cuda.LongTensor] but found type Variable[torch.LongTensor] for argument #2 'index'
Sorry, I'm new to PyTorch and don't know much about it yet.
Huh, I wonder why I didn't run into this problem? I was debugging code that isn't mine, so I don't really have time to look into this. Tensors can live on the CPU (torch.LongTensor) or on the GPU (torch.cuda.LongTensor). It looks like PyTorch wants that tensor moved to the GPU, which you do by calling .cuda() on the LongTensor object. Since I don't know which of the variables PyTorch is complaining about, I can't do more without some trial and error.
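A generic sketch of the fix (idx is a hypothetical stand-in; I don't know which variable in train.py is the offender):

import torch
from torch.autograd import Variable

idx = Variable(torch.LongTensor([0, 2, 1]))  # CPU: Variable[torch.LongTensor]

if torch.cuda.is_available():
    idx = idx.cuda()  # now Variable[torch.cuda.LongTensor]

# A GPU op such as index_select needs its index on the GPU as well:
# gpu_tensor.index_select(0, idx)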