Transformer
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu!
File "E:\Transformer-master\Batch.py", line 26, in create_masks
    trg_mask = trg_mask & np_mask
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu!
It used to work without problems, but after I formatted my machine and reinstalled the programs, this error appeared and I do not know the cause.
This error is raised because you ran the project on the GPU while at least one tensor was still on the CPU.
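One quick way to confirm this is to check the .device attribute of each operand: the boolean & in trg_mask & np_mask requires both tensors on the same device. A minimal sketch (the tensor shapes here are illustrative, not the repo's actual masks):

```python
import torch

# Two boolean mask tensors; in the failing code one lived on cuda:0
# and the other on the CPU. Here both start on the CPU.
trg_mask = torch.ones(1, 1, 5, dtype=torch.bool)
np_mask = torch.tril(torch.ones(5, 5, dtype=torch.bool)).unsqueeze(0)

# Pick whatever device is actually available.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Moving BOTH operands to the same device before combining avoids the error.
trg_mask = trg_mask.to(device)
np_mask = np_mask.to(device)
combined = trg_mask & np_mask

print(trg_mask.device == np_mask.device)  # True
```

Printing tensor.device before the failing line usually shows exactly which tensor was left behind on the CPU.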
I know what the problem is; I am looking for a solution.
I tried moving trg_mask to the GPU, but then I hit another problem: the input and output must be on the same device.
To solve this you have to move trg_mask back to the CPU after the operation, but it is not moved back, and I don't know why.
I installed an older version of PyTorch, but the problem is still the same.
def create_masks(src, trg, opt):
    src_mask = (src != opt.src_pad).unsqueeze(-2).to(opt.device)
    if trg is not None:
        trg_mask = (trg != opt.trg_pad).unsqueeze(-2).to(opt.device)
        size = trg.size(1)  # get seq_len for matrix
        np_mask = nopeak_mask(size, opt).to(opt.device)
        trg_mask = trg_mask & np_mask
    else:
        trg_mask = None
    return src_mask, trg_mask
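For anyone who wants to check this version in isolation: here is a self-contained sketch that runs on CPU-only machines as well. The nopeak_mask stand-in and the Opt class are assumptions for the demo (the real nopeak_mask and opt come from the repo); the create_masks body is the same as above.

```python
import torch

# Stand-in for the repo's nopeak_mask: a lower-triangular matrix that
# blocks attention to future positions.
def nopeak_mask(size, opt):
    mask = torch.tril(torch.ones(size, size, dtype=torch.bool))
    return mask.unsqueeze(0).to(opt.device)

def create_masks(src, trg, opt):
    src_mask = (src != opt.src_pad).unsqueeze(-2).to(opt.device)
    if trg is not None:
        trg_mask = (trg != opt.trg_pad).unsqueeze(-2).to(opt.device)
        size = trg.size(1)  # get seq_len for matrix
        np_mask = nopeak_mask(size, opt).to(opt.device)
        trg_mask = trg_mask & np_mask
    else:
        trg_mask = None
    return src_mask, trg_mask

# Minimal opt object with hypothetical pad ids; device falls back to CPU.
class Opt:
    src_pad = 0
    trg_pad = 0
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

opt = Opt()
src = torch.tensor([[5, 6, 7, 0]])  # one sequence of length 4, last token is padding
trg = torch.tensor([[5, 6, 0]])     # one sequence of length 3
src_mask, trg_mask = create_masks(src, trg, opt)
print(src_mask.shape, trg_mask.shape)  # torch.Size([1, 1, 4]) torch.Size([1, 3, 3])
```

Since every tensor is moved with .to(opt.device), both masks come back on the same device regardless of whether CUDA is available.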
You can try this code.
I tried the code above, but it also does not work!
I fixed the problem by moving src and trg to CUDA before calling the create_masks function. In the training file, at line 30, I changed:

src = batch.src.transpose(0, 1).cuda()
trg = batch.trg.transpose(0, 1).cuda()
trg_input = trg[:, :-1]
src_mask, trg_mask = create_masks(src, trg_input, opt)
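A device-agnostic variant of the same fix: instead of hard-coding .cuda() (which fails on machines without a GPU), move the batch tensors to whatever device the run is using. The batch tensors below are random stand-ins for the real torchtext batch fields:

```python
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Stand-ins for batch.src and batch.trg (seq_len x batch_size, as in the repo).
batch_src = torch.randint(1, 10, (4, 2))
batch_trg = torch.randint(1, 10, (5, 2))

# .to(device) replaces .cuda(), so the same line also runs on CPU-only boxes.
src = batch_src.transpose(0, 1).to(device)
trg = batch_trg.transpose(0, 1).to(device)
trg_input = trg[:, :-1]

print(src.device == trg_input.device)  # True
```

With this, src, trg, and everything create_masks derives from them already share one device, which is exactly what the error message asks for.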
Thank you @anasAloklah. Your solution worked for me!
Your solution worked for me too, thanks a lot!