electra
Pretraining with multiple GPUs
Hi guys, can we use multiple GPUs to pretrain ELECTRA-small or ELECTRA-base now? Has anyone figured it out? Thanks.
Check out NVIDIA's implementation: https://github.com/NVIDIA/DeepLearningExamples/tree/master/TensorFlow2/LanguageModeling/ELECTRA
Or this: https://github.com/richarddwang/electra_pytorch
Hello, I tried electra_pytorch, but only one GPU was running. Have you run it with multiple GPUs? How did you do that?
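Not an electra_pytorch-specific answer (I haven't checked how its training loop is wired up, so it may need extra adaptation), but the standard way to spread a PyTorch training loop across GPUs is DistributedDataParallel launched with torchrun. Here is a minimal, generic sketch; the model and data are placeholders, not the actual ELECTRA pretraining code:

```python
# Minimal DistributedDataParallel sketch (placeholder model/data).
# Launch with: torchrun --nproc_per_node=<num_gpus> train_ddp.py
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler

def main():
    # torchrun sets LOCAL_RANK (and RANK/WORLD_SIZE) for each worker process.
    local_rank = int(os.environ["LOCAL_RANK"])
    dist.init_process_group(backend="nccl")
    torch.cuda.set_device(local_rank)

    # Placeholder model; swap in the ELECTRA model you are pretraining.
    model = torch.nn.Linear(128, 128).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    # Placeholder data; DistributedSampler shards it across processes
    # so each GPU sees a distinct slice of every epoch.
    dataset = TensorDataset(torch.randn(1024, 128), torch.randn(1024, 128))
    sampler = DistributedSampler(dataset)
    loader = DataLoader(dataset, batch_size=32, sampler=sampler)

    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
    loss_fn = torch.nn.MSELoss()

    for epoch in range(2):
        sampler.set_epoch(epoch)  # reshuffle shards each epoch
        for x, y in loader:
            x, y = x.cuda(local_rank), y.cuda(local_rank)
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()  # gradients are all-reduced across GPUs here
            optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

If only one GPU is active, a common cause is that the script was started as a single process (e.g. plain `python train.py`) instead of one process per GPU via torchrun, so nothing ever shards the work across devices.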