
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu!

Open trra1988 opened this issue 3 years ago • 2 comments

Thanks for releasing the code. I have a small question: how do I set the GPU to train the model? When I train the model, this error shows up. Thanks.

"""
The device argument should be set by using torch.device or passing a string as an argument. This behavior will be deprecated soon and currently defaults to cpu.
training model...
Traceback (most recent call last):
  ] 0% loss = ...
  File "train.py", line 183, in <module>
    main()
  File "train.py", line 111, in main
    train_model(model, opt)
  File "train.py", line 34, in train_model
    src_mask, trg_mask = create_masks(src, trg_input, opt)
  File "/home/lin/program/Transformer-master/Batch.py", line 26, in create_masks
    trg_mask = trg_mask & np_mask
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu!
"""

trra1988 avatar Apr 23 '21 07:04 trra1988

@trra1988 you may need to use:

src_mask, trg_mask = create_masks(src.cuda(), trg_input.cuda(), opt)

instead of

 src_mask, trg_mask = create_masks(src, trg_input, opt) 
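Alternatively, the mismatch can be fixed inside the mask-building code itself, so the fix works regardless of where the inputs live. A minimal, device-agnostic sketch (not the repo's actual `Batch.py` — the padding token value and tensor shapes here are assumptions for illustration) that builds the no-peek mask on the same device as the target tensor:

```python
import torch

def create_masks(src, trg, pad_token=0):
    # Padding mask for the source: True where the token is not padding.
    # Shapes assumed: src, trg are (batch, seq_len) LongTensors.
    src_mask = (src != pad_token).unsqueeze(-2)

    # Padding mask for the target, combined with a look-ahead mask.
    trg_mask = (trg != pad_token).unsqueeze(-2)
    size = trg.size(1)

    # Build the no-peek mask directly on trg's device (cuda:0 or cpu),
    # which avoids the "two devices" RuntimeError from the traceback.
    np_mask = torch.triu(
        torch.ones(1, size, size, device=trg.device), diagonal=1
    ) == 0
    trg_mask = trg_mask & np_mask
    return src_mask, trg_mask
```

Passing `device=trg.device` when creating new tensors keeps everything co-located, whether training runs on CPU or GPU, without sprinkling `.cuda()` calls at the call sites.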

A-Kerim avatar Apr 23 '21 07:04 A-Kerim

@A-Kerim that solved the problem, thanks

trra1988 avatar Apr 23 '21 09:04 trra1988