
Question regarding the MultiEpochSampler

Open · zxhuang97 opened this issue 5 years ago · 1 comment

I've been reading the codebase, and the MultiEpochSampler looks odd to me. I may be misunderstanding it, but it seems that in utils.get_data_loaders, train_set is a vanilla data loader that reads through all the data once, i.e. one epoch of data, while sampler is a MultiEpochSampler whose length is len(train_set) * num_epochs. During training, a nested loop iterates over this sampler num_epochs times, so in total the model is trained on num_epochs^2 * len(train_set) images, and the "real" number of epochs is actually num_epochs^2.

I'm wondering if it's on purpose or a bug. Thank you very much.
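For concreteness, here is a minimal sketch of the counting I mean (not the repository's actual code: the toy dataset, the constants, and the class name `MultiEpochSamplerSketch` are made up, and start_itr handling is omitted):

```python
import numpy as np
import torch
from torch.utils.data import Dataset, DataLoader, Sampler


class ToyDataset(Dataset):
    """Hypothetical stand-in dataset of n integer items."""
    def __init__(self, n):
        self.n = n

    def __len__(self):
        return self.n

    def __getitem__(self, idx):
        return torch.tensor(idx)


class MultiEpochSamplerSketch(Sampler):
    """Yields len(data_source) * num_epochs indices in a single pass."""
    def __init__(self, data_source, num_epochs):
        self.data_source = data_source
        self.num_epochs = num_epochs

    def __iter__(self):
        n = len(self.data_source)
        # Concatenate num_epochs independent permutations of the dataset.
        order = np.concatenate(
            [np.random.permutation(n) for _ in range(self.num_epochs)]
        )
        return iter(order.tolist())

    def __len__(self):
        return len(self.data_source) * self.num_epochs


num_epochs, dataset = 3, ToyDataset(10)
loader = DataLoader(dataset, batch_size=5,
                    sampler=MultiEpochSamplerSketch(dataset, num_epochs))

seen = 0
for epoch in range(num_epochs):   # outer loop over "epochs"
    for batch in loader:          # each pass already covers num_epochs epochs
        seen += batch.numel()
print(seen)  # 90 = num_epochs**2 * len(dataset): the double counting in question
```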

zxhuang97 · Sep 12 '19

Having the same doubt here.

JanineCHEN · Sep 18 '20