PU-GAN
About the training scheduler
Hi,
The description of the scheduler in the paper is: “After 50k iterations, we gradually reduce both rates by a decay rate of 0.7 per 50k iterations until 10^−6.”
This means the first decay step would happen at iteration 100k. However, 100k is already larger than the total number of training iterations.
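To make that reading concrete, here is a minimal sketch of the schedule as I understand it (the function name and the base rate of 1e-3 are placeholders of mine, not values taken from the PU-GAN code):

```python
def decayed_lr(step, base_lr=1e-3, decay_rate=0.7,
               decay_start=50_000, decay_every=50_000, floor=1e-6):
    """Learning rate at a training step, under the reading that the
    first 0.7x reduction happens at iteration 100k (50k + one 50k period)."""
    # Number of completed 50k decay periods after the initial 50k.
    n = max(0, (step - decay_start) // decay_every)
    return max(base_lr * decay_rate ** n, floor)

# Under this reading: lr stays at base_lr up to step 99,999,
# drops to 0.7 * base_lr at 100k, 0.49 * base_lr at 150k, and so on,
# never going below 1e-6.
```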
Total number of training samples: 24,000
Batch size: 28
Number of epochs: 100

So the total number of iterations is (24,000 / 28, floored) × 100 = 857 × 100 = 85,700.
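For reference, a quick check of that count (assuming the last partial batch of each epoch is dropped, which is why the per-epoch step count is floored):

```python
num_samples = 24_000
batch_size = 28
num_epochs = 100

steps_per_epoch = num_samples // batch_size   # 857 (last partial batch dropped)
total_steps = steps_per_epoch * num_epochs
print(total_steps)                            # 85700, well below 100k
```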
I’d like to know if my understanding is correct. Thank you!