fast-bert
Using multiple training instances in AWS SageMaker.
Is it possible to speed up BERT training by using multiple training instances?
You can use p3.8xlarge and above for parallel processing across multiple GPUs.
You need to set the multi_gpu flag to True.
Thank you. Which config file do I set this in?
@nectario You would set this when initializing BertDataBunch.
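For reference, here is a minimal sketch of a BertDataBunch initialization with multi_gpu enabled, following the pattern from the fast-bert README. The paths, file names, and column names below are placeholders, not values from this thread.

```python
from fast_bert.data_cls import BertDataBunch

# Placeholder paths -- adjust to your own data layout.
DATA_PATH = './data/'
LABEL_PATH = './labels/'

databunch = BertDataBunch(DATA_PATH, LABEL_PATH,
                          tokenizer='bert-base-uncased',
                          train_file='train.csv',
                          val_file='val.csv',
                          label_file='labels.csv',
                          text_col='text',
                          label_col='label',
                          batch_size_per_gpu=16,
                          max_seq_length=512,
                          multi_gpu=True,   # enable data-parallel training across the GPUs on the instance
                          multi_label=False,
                          model_type='bert')
```

Note that multi_gpu spreads training across the GPUs of a single instance (e.g. the 4 GPUs of a p3.8xlarge), not across multiple SageMaker training instances. The learner is typically given a matching multi_gpu argument as well when it is created from the databunch.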