A2J-Transformer
Training time
Could you please provide details on the GPUs you used and the training time? I am trying to reproduce your results. I set the batch_size to 24 and trained on dual 3090 GPUs; one epoch took 4.34 hours. Is this reasonable?
I have the same question. I don't know if it's a problem on my end. Hoping for a response from the author!
Hi, thanks for your attention! That is reasonable. Because the InterHand2.6M dataset is really huge, the time cost is relatively large.
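For anyone sanity-checking their own setup, here is a rough back-of-envelope estimate of per-epoch and total training time. This is only a sketch; the split size, seconds per iteration, and epoch count below are placeholders, not values from this repo, so substitute your own measurements and config:

```python
# Back-of-envelope estimate of per-epoch and total training time.
# All concrete numbers are assumptions/placeholders -- verify against
# your local InterHand2.6M split and training config.

num_train_samples = 1_361_062   # hypothetical train-split size; check your annotation files
batch_size = 24                 # total batch size across both GPUs, per the question above
sec_per_iter = 0.28             # hypothetical; time a few hundred warm iterations to measure

iters_per_epoch = num_train_samples // batch_size
hours_per_epoch = iters_per_epoch * sec_per_iter / 3600
print(f"{iters_per_epoch} iters/epoch ~= {hours_per_epoch:.2f} h/epoch")

num_epochs = 42                 # hypothetical; use the epoch count from your training config
print(f"estimated total: {hours_per_epoch * num_epochs:.1f} h")
```

With placeholder values like these, an epoch lands in the ~4-hour range, consistent with the 4.34 h/epoch reported above, so a multi-day total training time on dual 3090s is expected.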