
Why is training over after one iter?

Open XXMxxm220 opened this issue 2 years ago • 7 comments

Why is training over after one iter?

XXMxxm220 avatar May 23 '23 03:05 XXMxxm220

After one epoch, training does not continue.

XXMxxm220 avatar May 23 '23 03:05 XXMxxm220

Hi, has the problem been solved?

zhanghaowei01 avatar May 26 '23 06:05 zhanghaowei01

Hi, has the problem been solved?

It's solved.

XXMxxm220 avatar May 26 '23 06:05 XXMxxm220

Could you share the fix?

zhanghaowei01 avatar May 26 '23 06:05 zhanghaowei01

Could you share the fix?

Hi, my approach was to add `for epoch in range(epochs):` before the `for it` loop, where `epochs` is a training-period count I set myself. I also moved the `save_model` statement inside the `for epoch` loop, so the model is saved once per epoch.

XXMxxm220 avatar May 26 '23 06:05 XXMxxm220
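The epoch-wrapping approach described above can be sketched roughly as follows. This is a minimal illustration, not the repo's actual training script: `step`, `save_model`, and the toy dataloader are stand-ins for BiSeNet's own training objects.

```python
# Sketch: wrap an iteration-based training loop in an outer epoch loop,
# and save the model once per epoch inside that loop.

def train(dataloader, n_epochs, step, save_model):
    """Run n_epochs passes over dataloader, saving after each epoch."""
    for epoch in range(n_epochs):           # outer loop added before `for it`
        for it, batch in enumerate(dataloader):
            step(batch)                     # one optimisation step (stand-in)
        save_model(epoch)                   # save once per epoch, inside the loop

# Toy usage: count steps and saves with a fake 3-batch "dataloader".
steps, saves = [], []
train([0, 1, 2], n_epochs=2,
      step=lambda b: steps.append(b),
      save_model=lambda e: saves.append(e))
```

With 3 batches and 2 epochs, the step callback fires 6 times and the save callback twice, matching the "save each epoch" behaviour described.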

Why do you need this? You can compute the total iterations as len(dataset) * n_epoches / batch_size, and set the total iteration count to that value.

CoinCheung avatar Jun 08 '23 07:06 CoinCheung

Why do you need this? You can compute the total iterations as len(dataset) * n_epoches / batch_size, and set the total iteration count to that value.

You are right. Thank you for your response. I just wanted to use epochs rather than iters to measure a training run, so I chose that way. This approach also does not affect the final output.

XXMxxm220 avatar Jun 08 '23 07:06 XXMxxm220