Shunted-Transformer
About distributed training
Exciting work. 👍 I'm trying to run the dist_train.sh script you provided and get the error below. I only have one GPU — is that the cause, or is this script only for distributed training? Is there another way to train?
Looking forward to your reply. Thank you.
Hi, you can look at what dist_train.sh actually does, and then run the main.py script directly with the same parameters. Have a try.
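To illustrate the reply above: a minimal sketch of a single-GPU run, assuming dist_train.sh wraps main.py in a PyTorch distributed launcher (the exact arguments must be copied from dist_train.sh itself; none of the flags below are from the repo):

```shell
# Option 1 (hypothetical): keep the distributed launcher, but with one process.
# torch.distributed.launch is PyTorch's launcher module; the arguments to
# main.py should be whatever dist_train.sh passes.
python -m torch.distributed.launch --nproc_per_node=1 main.py  # + args from dist_train.sh

# Option 2 (hypothetical): call main.py directly, as the reply suggests,
# again reusing the parameters found inside dist_train.sh.
python main.py  # + args from dist_train.sh
```

Whether option 2 works without the launcher depends on whether main.py initializes a process group unconditionally; if it does, option 1 with a single process is the safer route.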