
Augmented data up to 15GB?

Open YNX940214 opened this issue 6 years ago • 6 comments

I generated the augmented data with the MATLAB script and the 291 images, and the resulting train.h5 is about 15 GB. The paper says the training procedure "takes roughly 4 hours on GPU Titan Z", but it takes far longer than 4 hours on my device, a GeForce GTX 1080 Ti. Is it because of my device, or because the augmented data is too large?
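For context, the rough size of train.h5 can be estimated from the patch-extraction settings. Below is a back-of-the-envelope sketch; all the concrete numbers (patch size, stride, scale factors, average image size, number of augmentations) are assumptions in the style of this repo's MATLAB script, so plug in your script's actual settings:

```python
# Estimate the size of train.h5 from patch-extraction settings.
# Every constant below is an assumption -- adjust to match your
# generate_train.m configuration.

def num_patches(w, h, size=41, stride=41):
    """Number of size x size patches extracted with the given stride."""
    return ((w - size) // stride + 1) * ((h - size) // stride + 1)

images = 291
avg_w, avg_h = 300, 300        # rough average image size (assumption)
scales = [1.0, 0.7, 0.5]       # downscale augmentation (assumption)
flips_rots = 8                 # 4 rotations x 2 flips (assumption)

total = 0
for s in scales:
    total += images * flips_rots * num_patches(int(avg_w * s), int(avg_h * s))

bytes_per_patch = 41 * 41 * 4 * 2  # float32, input + label pair
print(f"~{total} patches, ~{total * bytes_per_patch / 1024**3:.1f} GB")
```

A smaller stride or more scales multiplies the patch count quickly, which is why small script changes can swing the file from a few GB to 15 GB.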

YNX940214 avatar Jun 17 '18 02:06 YNX940214

Hi @YNX940214 The size you mentioned for the augmented data is correct, and training does take longer than 4 hours in this implementation. The main reason is that I didn't use on-the-fly augmentation during training, since I have to use MATLAB for the bicubic interpolation. Please feel free to modify the code : )
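For anyone who wants to try the modification: a minimal sketch of what on-the-fly degradation could look like in PyTorch, replacing the precomputed MATLAB pairs with bicubic downscale/upscale done per batch. Note this is an assumption about how one might do it, and PyTorch's bicubic is not bit-identical to MATLAB's imresize (no antialiasing here), so reported PSNR may shift slightly:

```python
import torch
import torch.nn.functional as F

def degrade_on_the_fly(hr, scale=3):
    """Build a (bicubic-upscaled LR, HR) training pair per batch,
    avoiding a huge precomputed train.h5. hr: (N, C, H, W) in [0, 1]."""
    h, w = hr.shape[-2:]
    lr = F.interpolate(hr, size=(h // scale, w // scale),
                       mode="bicubic", align_corners=False)
    inp = F.interpolate(lr, size=(h, w), mode="bicubic", align_corners=False)
    # bicubic can overshoot slightly, so clamp back into range
    return inp.clamp(0, 1), hr

# Example: one batch of random 41x41 "HR" patches
hr = torch.rand(16, 1, 41, 41)
inp, target = degrade_on_the_fly(hr, scale=3)
print(inp.shape, target.shape)  # both torch.Size([16, 1, 41, 41])
```

With this, only the raw HR patches need to be stored, and flips/rotations can likewise be applied per batch instead of being baked into the file.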

twtygqyy avatar Jun 18 '18 15:06 twtygqyy

@YNX940214 Hi, may I ask how many hours it takes to train the model for 50 epochs? My GPU is a Titan X, and it seems slow as well. Thanks.

MingSun-Tse avatar Oct 19 '18 05:10 MingSun-Tse

@MingSun-Tse Sorry, I forgot. I haven't touched VDSR for a long time, but I may check it in a few days.

YNX940214 avatar Oct 19 '18 05:10 YNX940214

@YNX940214 Okay, never mind. Thank you anyway.

MingSun-Tse avatar Oct 19 '18 06:10 MingSun-Tse

@MingSun-Tse It takes about 20 minutes per epoch on a GTX 1080 Ti.

YNX940214 avatar Oct 21 '18 02:10 YNX940214

@twtygqyy Hello, I used your code to generate the augmented data with the MATLAB script and the 291 images, and my train.h5 is about 7 GB. I see that @YNX940214's train.h5 is about 15 GB. Could you tell me the reason? Thanks.
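Size differences like this usually come from the script settings (stride, scale factors, number of flips/rotations) or the stored dtype (MATLAB doubles vs. singles). A quick way to compare two files is to list each dataset's shape, dtype, and uncompressed size. This is a sketch using h5py; the dataset names "data" and "label" are what this repo's loader expects, but verify them on your own file:

```python
import os
import tempfile

import h5py
import numpy as np

def summarize_h5(path):
    """List every dataset in an HDF5 file with its shape, dtype,
    and uncompressed byte size -- handy for comparing two train.h5 files."""
    rows = []
    with h5py.File(path, "r") as f:
        def visit(name, obj):
            if isinstance(obj, h5py.Dataset):
                rows.append((name, obj.shape, str(obj.dtype),
                             obj.size * obj.dtype.itemsize))
        f.visititems(visit)
    return rows

# Demo on a tiny stand-in file (your real train.h5 will be much larger);
# note how float64 doubles the size of an identically shaped dataset.
path = os.path.join(tempfile.mkdtemp(), "demo.h5")
with h5py.File(path, "w") as f:
    f["data"] = np.zeros((10, 1, 41, 41), np.float32)
    f["label"] = np.zeros((10, 1, 41, 41), np.float64)

for name, shape, dtype, nbytes in summarize_h5(path):
    print(f"{name}: {shape} {dtype} {nbytes / 1e6:.2f} MB")
```

Running this on both files should show immediately whether the gap comes from a different patch count or a different dtype.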

zymize avatar Mar 17 '20 03:03 zymize