
Question about the training speed

Open JackHeroFly opened this issue 3 years ago • 3 comments

Hello, author. Thank you for your code. I am using CUDA and have reduced the batch size to 4, but the model still trains very slowly. Did you run into this situation? How can I solve this problem? Looking forward to your reply.

JackHeroFly avatar Oct 12 '21 01:10 JackHeroFly

> Hello, author. Thank you for your code. I am using CUDA and have reduced the batch size to 4, but the model still trains very slowly. Did you run into this situation? How can I solve this problem? Looking forward to your reply.

I am also facing the same problem.

MukulKadaskar avatar Mar 08 '23 16:03 MukulKadaskar

Even with multiple GPUs, it still does not utilise the GPUs fully and trains very slowly.

MukulKadaskar avatar Mar 08 '23 16:03 MukulKadaskar

> Hello, author. Thank you for your code. I am using CUDA and have reduced the batch size to 4, but the model still trains very slowly. Did you run into this situation? How can I solve this problem? Looking forward to your reply.

Hi, I am facing the same problem. The cause is that the code iterates over patches with Python "for" loops when processing the local branch, but I don't have a way to fix it.

canglangzhige avatar Nov 10 '23 02:11 canglangzhige
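To illustrate the point raised above: iterating over the patch grid with Python "for" loops launches one small operation per patch, which keeps the GPU idle most of the time. A common fix is to fold the patch grid into the batch dimension with reshapes so all patches are processed in a single batched call. This is only a minimal NumPy sketch of that idea, not the repository's actual local-branch code; the doubling op stands in for whatever per-patch computation the local branch performs, and `p` is an assumed patch size:

```python
import numpy as np

def patches_loop(x, p):
    # Slow pattern reported in this issue: Python for-loops over the patch grid,
    # one small operation per (i, j) patch position.
    B, C, H, W = x.shape
    out = np.zeros_like(x)
    for i in range(H // p):
        for j in range(W // p):
            patch = x[:, :, i*p:(i+1)*p, j*p:(j+1)*p]
            out[:, :, i*p:(i+1)*p, j*p:(j+1)*p] = patch * 2.0  # stand-in for the local branch
    return out

def patches_vectorized(x, p):
    # Faster pattern: fold the patch grid into the batch dimension,
    # run one batched op over all patches, then restore the original layout.
    B, C, H, W = x.shape
    xp = x.reshape(B, C, H // p, p, W // p, p)
    xp = xp.transpose(0, 2, 4, 1, 3, 5).reshape(-1, C, p, p)  # (B * gh * gw, C, p, p)
    xp = xp * 2.0  # one call instead of (H/p) * (W/p) Python iterations
    xp = xp.reshape(B, H // p, W // p, C, p, p).transpose(0, 3, 1, 4, 2, 5)
    return xp.reshape(B, C, H, W)

x = np.random.rand(2, 3, 8, 8).astype(np.float32)
print(np.allclose(patches_loop(x, 4), patches_vectorized(x, 4)))  # True
```

In PyTorch the same folding can be done with `reshape`/`permute` (or `nn.Unfold`), so the per-patch transformer block runs once over a larger batch instead of once per patch.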