Training time on Something-V1 is too long
With my 2-GPU setup, training on the Something-V1 dataset takes ~3.5 days for 60 epochs.
Do we need to train for the full 60 epochs to get the desired results, or can they be obtained with fewer epochs?
Could you please tell me whether your results can be reproduced with fewer epochs (e.g., 30-40)? Did you try that? The training time is very long.
Could you also share your .log file, if possible?
P.S.: I am training the num_segments=8 case.