HowToTrainYourMAMLPytorch
About zero_grad()
I found calls to `self.optimizer.zero_grad()` and `self.zero_grad()` after `self.meta_update(loss=losses['loss'])`. What is their purpose? It seems that `self.optimizer.zero_grad()` is already called inside `self.meta_update(loss=losses['loss'])`. As for `self.zero_grad()`, I couldn't figure out its aim. Could you please explain them? Thanks a lot!
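For reference, here is a minimal sketch of the call pattern I am asking about. The `MockOptimizer` and `meta_update` below are plain-Python stand-ins I wrote for illustration, not the repo's actual classes; they only mimic the order of `zero_grad()`/`step()` calls, assuming `meta_update` follows the usual zero-grad / backward / step sequence:

```python
# Plain-Python stand-ins (NOT the repo's real code) illustrating the pattern:
# if meta_update() already calls optimizer.zero_grad() before backward()/step(),
# an extra zero_grad() afterwards only clears the gradient buffers left over
# from that step, so nothing can leak into the next outer-loop iteration.

class MockOptimizer:
    def __init__(self):
        self.grads = []   # stand-in for the parameters' .grad buffers
        self.calls = []   # record of calls, for illustration only

    def zero_grad(self):
        self.grads.clear()
        self.calls.append("zero_grad")

    def step(self):
        self.calls.append("step")


def meta_update(opt):
    opt.zero_grad()                 # clears stale grads from the last outer step
    opt.grads.append("meta-grad")   # stand-in for loss.backward()
    opt.step()                      # applies the meta-gradient


opt = MockOptimizer()
meta_update(opt)
opt.zero_grad()   # the extra call in question: empties the grad buffers again

print(opt.calls)  # order of calls: ['zero_grad', 'step', 'zero_grad']
print(opt.grads)  # [] -- no gradients remain after the second zero_grad()
```

Under this reading, the second `zero_grad()` looks redundant for correctness but harmless; I would like to confirm whether that is the intent.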