dlwpt-code
p2ch11: 13.5 hours to train for one epoch
I'm using a desktop with an RTX 3060 12GB GPU, 16GB of DDR4 RAM, and an Intel Core i5-10400F CPU. I also mounted an external HDD and ran p2ch11.prepcache… I've tried anywhere from zero to 8 workers and batch sizes ranging from 32 to 1024, but it still takes approximately 13.5 hours to train for one epoch (with batch size 1024 and 4 workers). I still haven't figured out what's wrong; it looks like I can't utilize the GPU for some reason…
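As a first step in diagnosing this, a minimal check like the one below could confirm whether PyTorch actually sees the CUDA device at all (if `torch.cuda.is_available()` returns `False`, training silently falls back to the CPU, which would explain epoch times of this magnitude). This is just a generic sanity-check sketch, not something from the book's code:

```python
# Hedged sanity check: does PyTorch see the GPU? Falls back gracefully
# if torch isn't installed in the current environment.
try:
    import torch
    cuda_ok = torch.cuda.is_available()
    print("CUDA available:", cuda_ok)
    if cuda_ok:
        # Name of the first visible CUDA device (e.g. the RTX 3060).
        print("Device:", torch.cuda.get_device_name(0))
except ImportError:
    print("CUDA available: unknown (torch not installed)")
```

If that reports `True`, the next thing worth watching is `nvidia-smi` during training to see whether the GPU is actually busy or just waiting on the data loader.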