
lamb optimizer mistake

Open · trofimovaolga opened this issue 3 years ago

Hi, I was checking your LAMB implementation and I think there is a mistake in it. According to the paper, the moment estimates `exp_avg` and `exp_avg_sq` (`m` and `v`) must be bias-corrected before use:

- `m_hat = m / (1 - beta1^t)`
- `v_hat = v / (1 - beta2^t)`

In your implementation this correction is never applied, so even when `self.debias == True`, the update derived from `adam_norm` is still missing it. Please correct me if I'm wrong.
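For reference, here is a minimal sketch of the bias-corrected moment update the paper describes. This is an illustrative standalone function, not the repository's actual code; the function name, the `eps` value, and the scalar (non-tensor) arguments are assumptions for the example:

```python
import math

def bias_corrected_adam_step(exp_avg, exp_avg_sq, grad,
                             beta1=0.9, beta2=0.999, step=1, eps=1e-6):
    """Return updated moments and the bias-corrected Adam direction.

    Sketch only: operates on scalars for clarity; real optimizer code
    would do the same element-wise on parameter tensors.
    """
    # Exponential moving averages of the gradient and its square.
    exp_avg = beta1 * exp_avg + (1 - beta1) * grad
    exp_avg_sq = beta2 * exp_avg_sq + (1 - beta2) * grad * grad

    # Bias correction from the paper: divide by (1 - beta^t).
    m_hat = exp_avg / (1 - beta1 ** step)
    v_hat = exp_avg_sq / (1 - beta2 ** step)

    # The Adam update direction whose norm feeds the LAMB trust ratio.
    adam_step = m_hat / (math.sqrt(v_hat) + eps)
    return exp_avg, exp_avg_sq, adam_step
```

At `step=1` with zero-initialized moments, the correction exactly cancels the `(1 - beta)` factors, so `m_hat` equals the raw gradient; without it, the early update directions are heavily shrunk toward zero.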

trofimovaolga — Jun 24 '22