Run Time Error and Transfer Learning?
I got the following error while running training:
python train.py -src_data data/europarl-v7_de.txt -trg_data data/europarl-v7_en.txt -src_lang de -trg_lang en -SGDR -epochs 10 -checkpoint 10 -batchsize 128 -load_weights weights
loading spacy tokenizers...
loading presaved fields...
creating dataset and iterator...
The `device` argument should be set by using `torch.device` or passing a string as an argument. This behavior will be deprecated soon and currently defaults to cpu.
Traceback (most recent call last):
File "train.py", line 185, in