Question about step-time in TensorFlow NMT
Hello. Thank you for sharing this great tutorial about neural machine translation.
I read that training on the WMT16 corpus takes 0.7 s step-time on an Nvidia Titan X. Does that mean that going through 340,000 steps requires 340000 * 0.7 = 238000 seconds (≈ 66 hours)?
I trained on a single RTX 2080 GPU and measured that it took about 68 s between every 100 steps. Does that mean one step-time is 68/100 = 0.68 s, so the full run would require 340000 * 0.68 = 231200 seconds (≈ 64.2 hours)?
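Just to make my arithmetic explicit, here is a small sketch of how I am computing these projections (the step counts and timings are simply the numbers quoted above, and `total_training_hours` is just a helper name I made up):

```python
def total_training_hours(step_time_s: float, total_steps: int) -> float:
    """Projected wall-clock training time in hours, assuming a fixed per-step cost."""
    return step_time_s * total_steps / 3600.0

# Titan X figure quoted in the tutorial: 0.7 s per step.
print(total_training_hours(0.7, 340_000))        # ~66.1 hours

# My RTX 2080 run: ~68 s per 100 logged steps -> 0.68 s per step.
step_time_rtx = 68 / 100
print(total_training_hours(step_time_rtx, 340_000))  # ~64.2 hours
```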
My questions are:
- Am I interpreting your definition of step-time correctly?
- I checked the prices and saw that the Nvidia Titan X is much more expensive than the RTX 2080, so I assumed the RTX 2080 would perform worse than the Titan X. Is there a likely explanation for the two showing roughly the same performance here?
- In your experiment, the Nvidia K40 took longer than the Titan X even though it is much more expensive. Does that mean the K40 has worse performance than the Titan X for this workload?
Thank you again