
Question about training epoch?

Open gaow0007 opened this issue 5 years ago • 2 comments

I have a question about the number of training epochs. Why do you use 600-2000 epochs to validate the superiority of your method? That epoch count seems very large; I usually train on these tiny datasets for only 200 epochs. Is there a reason for this setting?

Best

gaow0007 avatar Apr 12 '19 03:04 gaow0007

We achieved better results when running for more epochs (for both manifold mixup and our baselines), but we definitely saw an improvement with manifold mixup over input mixup for 600 epochs.

I think it helps even with a smaller number of epochs (<600), but I don't recall whether I've actually run that experiment.

alexmlamb avatar Apr 12 '19 04:04 alexmlamb

Another way to think about this is that Manifold Mixup is a stronger regularizer, and hence it needs more training epochs to converge.

vikasverma1077 avatar May 02 '19 17:05 vikasverma1077
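For readers unfamiliar with the technique being discussed: the mixing step itself is the same whether it is applied at the input (input mixup) or at a randomly chosen hidden layer (manifold mixup). The sketch below is an illustrative numpy version, not the repository's actual code; the function name `mixup_batch` and the default `alpha` are placeholders, and a real training loop would apply this inside the network's forward pass.

```python
import numpy as np

def mixup_batch(h, y, alpha=2.0, rng=None):
    """Mix a batch of representations h and one-hot labels y.

    Input mixup applies this to the raw inputs; manifold mixup
    applies it to the activations of a randomly chosen hidden layer,
    then continues the forward pass on the mixed activations.
    """
    rng = rng if rng is not None else np.random.default_rng()
    lam = rng.beta(alpha, alpha)       # mixing coefficient in [0, 1]
    idx = rng.permutation(len(h))      # random pairing within the batch
    h_mix = lam * h + (1 - lam) * h[idx]
    y_mix = lam * y + (1 - lam) * y[idx]
    return h_mix, y_mix, lam
```

Because the interpolation reshapes the hidden representations every minibatch, the network sees a harder, more regularized objective, which is consistent with the comment above that more epochs are needed before the benefit over input mixup fully shows up.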