
Training time

Open HITKJ opened this issue 1 year ago • 2 comments

Could you please provide details of the GPUs used and the total training time? I am trying to reproduce your results. I set the batch_size to 24 and trained on dual 3090 GPUs; one epoch takes 4.34 hours. Is this reasonable?
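For reference, here is a minimal sketch of how I measure the per-epoch wall-clock time (the `model`, `loader`, `optimizer`, and `criterion` names are placeholders, not identifiers from this repo):

```python
import time
import torch

def timed_epoch(model, loader, optimizer, criterion, device):
    """Run one training epoch and return its wall-clock duration in hours."""
    model.train()
    start = time.time()
    for images, targets in loader:
        images, targets = images.to(device), targets.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), targets)
        loss.backward()
        optimizer.step()
    if device.type == "cuda":
        # ensure all queued GPU kernels finish before reading the clock
        torch.cuda.synchronize()
    return (time.time() - start) / 3600.0
```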

HITKJ avatar May 10 '23 13:05 HITKJ

I have the same question. I don't know if it's a problem on my end. Hoping for a response from the author!

liangshenglei avatar Dec 20 '23 07:12 liangshenglei

Hi, thanks for your attention! It is reasonable. Because the InterHand2.6M dataset is really huge, the training time is correspondingly long.
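As a rough sanity check (the train-split size and the per-GPU batch interpretation below are assumptions, not figures confirmed in this thread), 4.34 h/epoch works out to roughly half a second per iteration, which is plausible for a model of this size on 3090s:

```python
# Back-of-envelope check: does 4.34 h/epoch imply a plausible per-iteration time?
num_images = 1_360_000   # ASSUMED train-split size, not an official figure
batch_size = 24          # ASSUMED to be the per-GPU batch; if it is the global
num_gpus = 2             # batch, the per-iteration time doubles
epoch_hours = 4.34

iters_per_epoch = num_images / (batch_size * num_gpus)    # ~28.3k iterations
sec_per_iter = epoch_hours * 3600 / iters_per_epoch       # ~0.55 s/iteration
print(f"{iters_per_epoch:.0f} iterations/epoch, {sec_per_iter:.2f} s/iteration")
```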

ChanglongJiangGit avatar Dec 20 '23 13:12 ChanglongJiangGit