Transformer
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu!
Thanks for releasing the code. I have a small question: how do I set the GPU to train the model? When I train the model, this error shows up. Thanks.
The device argument should be set by using torch.device or passing a string as an argument. This behavior will be deprecated soon and currently defaults to cpu.
training model...
] 0% loss = ...
Traceback (most recent call last):
  File "train.py", line 183, in
@trra1988 You may need to use:
src_mask, trg_mask = create_masks(src.cuda(), trg_input.cuda(), opt)
instead of:
src_mask, trg_mask = create_masks(src, trg_input, opt)
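The error means the model's parameters live on `cuda:0` while some input tensors are still on the CPU. Beyond patching the `create_masks` call, a more robust pattern is to pick one `torch.device` up front and move both the model and every batch tensor to it. A minimal sketch of that pattern, using a stand-in `nn.Linear` for the real Transformer and random tensors in place of `src` / `trg_input` (none of these names come from the repo's actual training loop):

```python
import torch
import torch.nn as nn

# Pick the GPU when one is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Hypothetical stand-ins for the real Transformer model and batch tensors.
model = nn.Linear(8, 8).to(device)      # model parameters on `device`
src = torch.randn(4, 8).to(device)      # inputs moved to the same device
trg_input = torch.randn(4, 8).to(device)

# Every tensor now lives on one device, so the forward pass no longer
# mixes cuda:0 and cpu tensors.
out = model(src)
print(out.device.type == device.type)   # True
```

With this pattern the `.cuda()` calls become unnecessary: `create_masks(src, trg_input, opt)` already receives tensors on the right device, and the same script runs unchanged on a CPU-only machine.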
@A-Kerim The issue is solved, thanks!