HIT
Distributed training problem
Hello author, I am trying to use two GPUs for distributed training, but the run gets stuck at this point and training never starts. Have you encountered this situation?
I have not encountered this issue. If you're training on JHMDB, one GPU is enough; it should not take long to train.
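As a debugging aid, here is a minimal sketch of a two-GPU sanity check, assuming the training code is built on PyTorch DistributedDataParallel with the NCCL backend (the script name `ddp_check.py` is hypothetical, not part of this repo). If this also hangs before printing, the problem is likely in the NCCL/communication setup rather than in the training code; launching it with `NCCL_DEBUG=INFO` usually shows where initialization stalls.

```python
# Minimal two-GPU DDP sanity check (hypothetical standalone script, not part of HIT).
# Launch with: NCCL_DEBUG=INFO torchrun --nproc_per_node=2 ddp_check.py
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun sets LOCAL_RANK, RANK and WORLD_SIZE for each spawned process
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)
    dist.init_process_group(backend="nccl")

    # Tiny model, just enough to exercise a gradient all-reduce across the two GPUs
    model = torch.nn.Linear(16, 4).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    x = torch.randn(8, 16, device=local_rank)
    loss = model(x).sum()
    loss.backward()  # triggers the NCCL all-reduce; a hang here points to GPU/driver/NCCL issues

    dist.barrier()
    if dist.get_rank() == 0:
        print("DDP all-reduce completed on", dist.get_world_size(), "GPUs")
    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```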