Medical-Transformer
Question about the training speed
Hello, author. Thank you for your code. I am using CUDA and have set the batch size to 4, but the model still trains very slowly. Have you run into this situation? How can I solve this problem? Looking forward to your reply.
I am also facing the same problem.
Even with multiple GPUs, the code does not utilise the GPU to its full extent and trains very slowly.
Hi, I am facing the same problem. The reason is that the code uses Python `for` loops when processing the local branch, but I don't have a way to fix it yet.
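One common way to remove that bottleneck is to fold all patches into the batch dimension with a single reshape/transpose, run the per-patch computation once on the whole batch of patches, and reshape back. This is only a minimal sketch of the idea, not the repo's actual local branch: `per_patch_op` is a hypothetical stand-in for the per-patch attention, and the example uses NumPy for brevity (the equivalent `reshape`/`permute` calls exist on PyTorch tensors):

```python
import numpy as np

def per_patch_op(p):
    # Hypothetical stand-in for the per-patch computation
    # (in Medical-Transformer this would be the local-branch attention)
    return p * 2.0

def patches_loop(x, patch):
    # Slow baseline: Python for loops over non-overlapping patches,
    # similar in shape to the local-branch loop discussed above
    B, C, H, W = x.shape
    out = np.zeros_like(x)
    for i in range(0, H, patch):
        for j in range(0, W, patch):
            out[:, :, i:i + patch, j:j + patch] = per_patch_op(
                x[:, :, i:i + patch, j:j + patch])
    return out

def patches_batched(x, patch):
    # Fast path: fold every patch into the batch dimension with one
    # reshape/transpose, apply the op once, then reshape back
    B, C, H, W = x.shape
    p = x.reshape(B, C, H // patch, patch, W // patch, patch)
    p = p.transpose(0, 2, 4, 1, 3, 5).reshape(-1, C, patch, patch)
    p = per_patch_op(p)  # one vectorised call instead of H/patch * W/patch calls
    p = p.reshape(B, H // patch, W // patch, C, patch, patch)
    return p.transpose(0, 3, 1, 4, 2, 5).reshape(B, C, H, W)

x = np.random.rand(2, 3, 8, 8).astype(np.float32)
assert np.allclose(patches_loop(x, 4), patches_batched(x, 4))
```

Whether this applies directly depends on the per-patch op being independent across patches; if it is, the GPU sees one large kernel launch instead of one launch per patch, which is usually where the speedup comes from.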