
About updating the model's vars (parameters) during meta-training

Open mRSun15 opened this issue 6 years ago • 1 comments

I see that you use a variable `vars` to store all of the model's parameters. I think that's clean and clear; however, my question is: why not use PyTorch's built-in functions `state_dict()` and `load_state_dict()` to update the vars?

Thanks!

mRSun15 avatar May 01 '19 19:05 mRSun15

My understanding is that `load_state_dict` would change `self.net.parameters()` in place. The meta-update must be taken with respect to the base parameters `self.net.parameters()`; if their values were overwritten, the gradient would instead be with respect to `fast_weights` (meta.py line 137). Is that right?

IRNLPCoder avatar Jul 11 '19 16:07 IRNLPCoder
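
The point above can be illustrated with a minimal sketch (this is a hypothetical one-layer example, not the repo's actual `Learner` class): the inner-loop `fast_weights` are built with `torch.autograd.grad(..., create_graph=True)`, so they remain differentiable functions of the base parameters and the query loss can still propagate a meta-gradient back to them. Copying values in via `load_state_dict` is a plain in-place assignment with no such connection.

```python
import torch
import torch.nn.functional as F

# Hypothetical minimal model: forward takes an explicit parameter list,
# mirroring how the repo threads `vars` / fast_weights through the network.
def forward(x, params):
    w, b = params
    return F.linear(x, w, b)

torch.manual_seed(0)
w = torch.randn(1, 3, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
params = [w, b]

x_spt, y_spt = torch.randn(4, 3), torch.randn(4, 1)  # support set
x_qry, y_qry = torch.randn(4, 3), torch.randn(4, 1)  # query set

# Inner loop: fast_weights are differentiable functions of the base
# parameters; create_graph=True keeps the graph for the meta-gradient.
lr_inner = 0.1
loss_spt = F.mse_loss(forward(x_spt, params), y_spt)
grads = torch.autograd.grad(loss_spt, params, create_graph=True)
fast_weights = [p - lr_inner * g for p, g in zip(params, grads)]

# Outer loss is computed with fast_weights, yet the gradient still flows
# back to the ORIGINAL base parameters - this is the meta-update.
loss_qry = F.mse_loss(forward(x_qry, fast_weights), y_qry)
meta_grads = torch.autograd.grad(loss_qry, params)

# Had we instead overwritten the module's parameters via load_state_dict,
# the copied tensors would be detached value assignments, and the
# meta-gradient with respect to the base parameters would be lost.
```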