
How many epochs required for training?

Open hao-pt opened this issue 3 years ago • 2 comments

While digging into your code, I found that training runs for a fixed total number of iterations (call it max_steps). In experiment.py it is computed as max_steps = conf.total_samples // conf.batch_size_effective.

total_samples is predefined in template.py (e.g. 130_000_000 for ffhq128) and batch_size_effective defaults to 128. For this example, max_steps = 1_015_625. Since FFHQ128 contains 70,000 samples, the number of required epochs is 1_015_625 / (70_000 / 128) ≈ 1857 (that is a huge number of epochs to train :(( )
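The arithmetic above can be sketched as follows (constants taken from the discussion; variable names mirror the config fields mentioned):

```python
# Step/epoch arithmetic for the ffhq128 config described above.
total_samples = 130_000_000          # conf.total_samples in template.py
batch_size_effective = 128           # conf.batch_size_effective (default)
max_steps = total_samples // batch_size_effective   # as in experiment.py

dataset_size = 70_000                # number of images in FFHQ
steps_per_epoch = dataset_size / batch_size_effective
epochs = max_steps / steps_per_epoch

print(max_steps)        # 1015625
print(round(epochs))    # 1857
```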

Could you let me know whether my understanding is correct?

hao-pt avatar Jul 13 '22 11:07 hao-pt

Your understanding is correct. Note that the number of "samples" here is comparable to other DDPMs trained on the same dataset. Perhaps the number of epochs shouldn't be interpreted the same way as in a classification model when you are dealing with a generative model?

konpatp avatar Jul 13 '22 11:07 konpatp

Hi, may I know your training time in total for your DiffAE on FFHQ256?

tyrink avatar Dec 10 '22 13:12 tyrink