
Question Regarding Meta-train of PyTorch Implementation

Open · khaghanijavad opened this issue 4 years ago · 2 comments

Hi, thanks for your interesting work. I have a question regarding the meta-training step in the PyTorch implementation. In lines 167-169 of `meta.py`, the parameters of the classifier are updated according to equation #3 in the paper, using the loss on the training samples. The loss on the validation samples (equations #4 and #5 in the paper) should then be back-propagated to perform the second loop of meta-learning. However, in the released PyTorch implementation I can only see the forward pass on the validation samples; the loss does not appear to be back-propagated to optimise the SS parameters and theta (equations #4 and #5). For reference, a rough sketch of the inner-loop update I mean is shown below. I would really appreciate it if you could clarify this for me. Thank you very much.
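This is only an illustration, not the actual code from `meta.py`; the names `encoder`, `base_learner`, and `inner_lr` are placeholders:

```python
import torch
import torch.nn.functional as F

# Sketch of the inner loop (equation #3): theta is updated with the loss on
# the *training* (support) samples. `encoder`, `base_learner`, and `inner_lr`
# are illustrative names, not the exact identifiers used in meta.py.
def inner_update(encoder, base_learner, x_support, y_support, inner_lr=0.01):
    logits = base_learner(encoder(x_support))
    loss = F.cross_entropy(logits, y_support)
    params = list(base_learner.parameters())
    # create_graph=True keeps this update differentiable, so a meta loss on
    # the validation samples could later back-propagate through it (#4, #5).
    grads = torch.autograd.grad(loss, params, create_graph=True)
    fast_weights = [w - inner_lr * g for w, g in zip(params, grads)]
    return fast_weights
```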

khaghanijavad avatar Aug 23 '20 22:08 khaghanijavad

Hi,

Thanks for your interest in our work.

The back-propagation is performed here: https://github.com/yaoyao-liu/meta-transfer-learning/blob/d4ab548fe4258bab8e1549c8e1c9be175d52afe1/pytorch/trainer/meta.py#L169

The optimizer is defined here: https://github.com/yaoyao-liu/meta-transfer-learning/blob/d4ab548fe4258bab8e1549c8e1c9be175d52afe1/pytorch/trainer/meta.py#L61-L62 where `self.model.encoder.parameters()` are the SS parameters and `self.model.base_learner.parameters()` is theta.
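Put differently, a single backward pass on the validation (query) loss fills the gradients of both parameter groups, and one optimizer step updates them jointly. Here is a minimal sketch of that outer step; it is simplified, not the exact code from `meta.py`, and the forward pass producing `query_logits` from the fast weights is elided:

```python
import torch
import torch.nn.functional as F

# Sketch only: `model` is assumed to expose `encoder` (the SS parameters)
# and `base_learner` (theta), mirroring the optimizer definition linked above.
def make_meta_optimizer(model, meta_lr=1e-4):
    return torch.optim.Adam([
        {'params': model.encoder.parameters()},       # SS parameters
        {'params': model.base_learner.parameters()},  # theta
    ], lr=meta_lr)

def meta_step(optimizer, query_logits, query_labels):
    # The meta loss is computed on the validation (query) samples.
    meta_loss = F.cross_entropy(query_logits, query_labels)
    optimizer.zero_grad()
    meta_loss.backward()   # gradients flow to the SS parameters AND theta
    optimizer.step()       # one joint update (equations #4 and #5)
    return meta_loss.item()
```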

So the SS parameters and theta are optimized by the meta loss. If you have any further questions, feel free to add additional comments.

Best, Yaoyao

yaoyao-liu avatar Aug 23 '20 23:08 yaoyao-liu

Thank you very much for your prompt response.

khaghanijavad avatar Aug 24 '20 19:08 khaghanijavad