pytorch-vdsr
Augmented data up to 15GB?
I generated the augmented data with the MATLAB script and the 291 images, and the resulting train.h5 is about 15GB. The paper says the training procedure "takes roughly 4 hours on GPU Titan Z", but it takes far more than 4 hours on my device, a GeForce GTX 1080 Ti. Is it because of the device, or is the augmented data too big?
Hi @YNX940214 The size you mentioned for the augmented data is correct, and it does take longer than 4 hours to train with this implementation. The main reason is that I didn't use on-the-fly augmentation during training, since I had to use MATLAB for the bicubic interpolation. Please feel free to modify the code : )
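For anyone who wants to try the modification: the geometric part of on-the-fly augmentation (random flips and rotations of each HR patch per batch, instead of storing every variant in train.h5) could be sketched like this. This is a minimal numpy sketch, not code from this repo; `augment_patch` is a hypothetical helper, and the bicubic LR generation would still need a separate interpolation step (e.g. `torch.nn.functional.interpolate` with `mode='bicubic'`, whose result differs slightly from MATLAB's `imresize`).

```python
import numpy as np

def augment_patch(hr, rng):
    """Hypothetical on-the-fly augmentation of one HR patch.

    Applies a random horizontal flip and a random 0/90/180/270 degree
    rotation, so the 8 dihedral variants never need to be stored on disk.
    """
    if rng.random() < 0.5:
        hr = np.flip(hr, axis=1)        # horizontal flip
    k = int(rng.integers(0, 4))         # number of 90-degree rotations
    hr = np.rot90(hr, k)
    return np.ascontiguousarray(hr)     # contiguous copy for downstream use

# Usage sketch: augment a square patch freshly for every training step.
patch = np.arange(16, dtype=np.float32).reshape(4, 4)
augmented = augment_patch(patch, np.random.default_rng(0))
```

Doing this per batch keeps the h5 file 8x smaller at the cost of a little CPU work per step; only the bicubic interpolation is the part that is awkward to move out of MATLAB.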
@YNX940214 hi, may I ask, how many hours it takes to train the model for 50 epochs? My GPU is TitanX, it seems slow as well. Thx.
@MingSun-Tse Sorry, I forgot. I haven't touch vdsr for a long time. But I may check it a few days later.
@YNX940214 okay, never mind. Thank you still.
@MingSun-Tse It takes about 20 mins for one epoch on GTX 1080 Ti.
@twtygqyy hello, I used your code to generate the augmented data with the MATLAB script and the 291 images, and my train.h5 is about 7GB. I see @YNX940214's train.h5 is about 15GB. Could you tell me the reason? Thanks.