
About updating the model parameters in the OmniglotNet class

Open · zhaoyu-li opened this issue 5 years ago · 1 comment

Thanks for your good implementation of MAML. However, I think using state_dict() and load_state_dict() might be much faster than modifying the weights manually (omniglot_net.py, line 43). Could I first deepcopy the net parameters (state_dict()), use the fast weights (updated with an optimizer), and then load the original parameters back to update the meta-learner? Thanks.

zhaoyu-li avatar Jan 30 '20 01:01 zhaoyu-li
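
For reference, the proposed flow seems to be roughly the following. This is a minimal sketch under assumptions: a tiny stand-in network and made-up data replace OmniglotNet and the Omniglot task batches, so it is illustrative rather than code from the repo.

```python
import copy
import torch
import torch.nn as nn

# Hypothetical tiny net standing in for OmniglotNet, just to illustrate the idea.
net = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
loss_fn = nn.CrossEntropyLoss()

# Save the original (meta) parameters.
original_state = copy.deepcopy(net.state_dict())

# Inner loop: adapt the net on support data with an ordinary optimizer.
inner_opt = torch.optim.SGD(net.parameters(), lr=0.1)
x_support = torch.randn(5, 4)
y_support = torch.randint(0, 2, (5,))
for _ in range(3):
    inner_opt.zero_grad()
    loss_fn(net(x_support), y_support).backward()
    inner_opt.step()

# Evaluate with the adapted ("fast") weights, then restore the originals.
x_query = torch.randn(5, 4)
y_query = torch.randint(0, 2, (5,))
query_loss = loss_fn(net(x_query), y_query)
net.load_state_dict(original_state)
# query_loss is NOT connected to the original parameters through the inner
# updates, so backpropagating it cannot produce MAML's meta-gradient.
```

Restoring the state dict does bring back the meta parameters, but as the next comment notes, the query loss computed this way has no autograd path back through the inner-loop updates.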

I also wanted to do that, but the gradients cannot backpropagate and the parameters do not update. Is the key point that the gradients have to be shared between the two models?

yucaodie avatar Apr 06 '20 10:04 yucaodie
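
The reason the gradients cannot flow back in that setup: optimizer.step() and load_state_dict() modify parameter data in place without being recorded by autograd, so the post-adaptation query loss is not a differentiable function of the original meta-parameters, and sharing gradient buffers between two model copies does not fix that. The usual MAML pattern (and, as far as I can tell, what the explicit weight handling in omniglot_net.py is for) keeps the inner update inside the graph: compute the inner gradients with create_graph=True and run a functional forward pass with the resulting fast weights. A minimal sketch, assuming a hypothetical two-layer net and toy data:

```python
import torch
import torch.nn.functional as F

# Functional forward pass over an explicit weights dict (assumed shapes,
# not the repo's exact architecture).
def forward(x, weights):
    x = F.linear(x, weights["w1"], weights["b1"])
    x = F.relu(x)
    return F.linear(x, weights["w2"], weights["b2"])

# Meta (slow) parameters.
weights = {
    "w1": torch.randn(8, 4, requires_grad=True),
    "b1": torch.zeros(8, requires_grad=True),
    "w2": torch.randn(2, 8, requires_grad=True),
    "b2": torch.zeros(2, requires_grad=True),
}

x_support = torch.randn(5, 4)
y_support = torch.randint(0, 2, (5,))
x_query = torch.randn(5, 4)
y_query = torch.randint(0, 2, (5,))

inner_lr = 0.1

# Inner step: build fast weights as differentiable functions of the meta
# weights (create_graph=True keeps the graph for second-order gradients).
support_loss = F.cross_entropy(forward(x_support, weights), y_support)
grads = torch.autograd.grad(support_loss, list(weights.values()), create_graph=True)
fast_weights = {
    name: w - inner_lr * g for (name, w), g in zip(weights.items(), grads)
}

# Outer step: the query loss under the fast weights backpropagates all the
# way to the original meta parameters.
query_loss = F.cross_entropy(forward(x_query, fast_weights), y_query)
meta_grads = torch.autograd.grad(query_loss, list(weights.values()))
```

Because fast_weights are differentiable functions of weights, the query loss backpropagates through the inner step and yields the meta-gradient (second-order MAML); dropping create_graph=True gives the first-order approximation.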