bert-han
Parallel model
I have a question: how to parallel your model using BERT?
Hi @dungdx34, I'm not sure I fully understand your question. What exactly do you mean by parallelizing the model using BERT? The architecture runs BERT on each sentence separately, so each document is processed in a single forward pass, if that's what you are asking.
Thank you for your reply! My problem is that I want to train your model on multiple GPUs, but the project does not support multi-GPU training. In PyTorch, to parallelize a model I would use `torch.nn.DataParallel`, for example: `self.model = nn.DataParallel(self.model)`. Please help me!
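For reference, a minimal sketch of the wrapping pattern being asked about. `ToyModel` is a hypothetical stand-in for the repo's actual model; the point is only how `nn.DataParallel` is applied. When fewer than two GPUs are available, `DataParallel` simply falls through to the wrapped module, so the same code also runs on CPU:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the real BERT-HAN model in this repo.
class ToyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(8, 2)

    def forward(self, x):
        return self.linear(x)

model = ToyModel()

# Wrap for multi-GPU training: DataParallel splits each batch along
# dimension 0 across the available GPUs and gathers the outputs.
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

# Note: after wrapping, parameter names gain a "module." prefix in
# state_dict(); access the underlying model via model.module if needed.
batch = torch.randn(4, 8).to(device)
out = model(batch)
print(tuple(out.shape))  # (4, 2)
```

One caveat when adapting this to the repo's training loop: checkpoints saved from a wrapped model carry the `module.` prefix, so loading them into an unwrapped model requires stripping that prefix (or saving `model.module.state_dict()` instead).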