
How do you update the base_learner's parameters?

Open xugy16 opened this issue 2 years ago • 3 comments

Thank you for the code.

I have a question about the base_learner update.

  1. The base learner is fast-updated for 100 inner-loop steps on the support set.
  2. Then we return `qry_logits` and compute the cross-entropy loss on the query set.
  3. Finally, `self.optimizer` is used to update the parameters.
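The three steps above can be sketched end to end. This is a hedged toy reconstruction, not the repository's code: the names (`loss_and_grad`, `w_base`, `w_fast`) are mine, NumPy stands in for PyTorch, and the classifier is reduced to a bias-free logistic layer on 2-D inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the FC classifier: a bias-free linear layer on 2-D inputs.
def loss_and_grad(w, x, y):
    # Binary cross-entropy with logits; labels y are in {0, 1}.
    p = 1.0 / (1.0 + np.exp(-(x @ w)))
    loss = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
    grad = x.T @ (p - y) / len(y)
    return loss, grad

# Base-learner parameters (what the outer optimizer meta-updates).
w_base = rng.normal(size=2)

# Support and query sets for one episode (toy task: label = sign of x[0]).
x_sup = rng.normal(size=(20, 2)); y_sup = (x_sup[:, 0] > 0).astype(float)
x_qry = rng.normal(size=(20, 2)); y_qry = (x_qry[:, 0] > 0).astype(float)

# Step 1: fast adaptation, e.g. 100 inner-loop steps on the support set.
w_fast = w_base.copy()
sup_loss_before, _ = loss_and_grad(w_fast, x_sup, y_sup)
for _ in range(100):
    _, g = loss_and_grad(w_fast, x_sup, y_sup)
    w_fast -= 0.1 * g
sup_loss_after, _ = loss_and_grad(w_fast, x_sup, y_sup)

# Step 2: query loss and gradient at the fast weights.
qry_loss, qry_grad = loss_and_grad(w_fast, x_qry, y_qry)

# Step 3: first-order meta-update: the query gradient computed at the
# fast weights is applied directly to the base weights.
meta_lr = 0.01
w_base -= meta_lr * qry_grad
```

Step 3 is where the question below comes in: the gradient stepping `w_base` was computed at `w_fast`, which is exactly the first-order shortcut discussed in the replies.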

But which gradient ends up stored in the base learner? The query loss is calculated with the fast model, not the base model.

xugy16 avatar Jul 01 '22 05:07 xugy16

Thanks for your interest in our work.

The fast model can be regarded as a function of the base learner. Thus, we can calculate the derivative of the query loss with respect to the base learner's parameters. We update the base learner using Eq. 5 in our paper.
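For readers without the paper open, the update being described has the shape of the standard MAML inner/outer steps. Note this is generic MAML notation, not necessarily the exact symbols of Eq. 5:

```latex
% Inner loop: fast weights \theta' as a function of the base learner \theta
\theta' \;=\; \theta - \alpha \,\nabla_{\theta}\, \mathcal{L}_{\text{support}}(\theta)
% Outer loop: meta-update of the base learner using the query loss
\theta \;\leftarrow\; \theta - \beta \,\nabla_{\theta}\, \mathcal{L}_{\text{query}}(\theta')
```

Because \(\theta'\) depends on \(\theta\), the outer gradient is well defined even though the query loss is evaluated at the fast weights.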

If you have any further questions, please feel free to contact me.

Best,

Yaoyao

yaoyao-liu avatar Jul 01 '22 14:07 yaoyao-liu

I really appreciate the response.

So you are using first-order MAML to update the classifier head (base learner)?

xugy16 avatar Jul 02 '22 03:07 xugy16

Yes. We use the first-order approximation of MAML to update the FC classifier.
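To make concrete what the first-order approximation drops, here is a toy 1-D sketch of my own (not the repository's code): a quadratic support loss `(x*theta - y)**2`, a quadratic query loss, a single inner step, and a comparison of the first-order meta-gradient against the exact one obtained by differentiating through the inner step numerically. All numbers are made up.

```python
def inner_adapt(theta, x, y, lr=0.1):
    # One inner-loop gradient step on the support loss (x*theta - y)**2.
    grad = 2 * x * (x * theta - y)
    return theta - lr * grad

def query_loss(theta, xq, yq):
    return (xq * theta - yq) ** 2

theta, x, y, xq, yq = 0.5, 1.3, 0.7, 0.9, -0.2
theta_fast = inner_adapt(theta, x, y)

# First-order meta-gradient: gradient of the query loss evaluated at the
# fast weights, applied to the base weights as-is.
fo_grad = 2 * xq * (xq * theta_fast - yq)

# Exact meta-gradient d L_qry(theta') / d theta via central differences,
# i.e. differentiating through the inner adaptation step.
eps = 1e-6
exact = (query_loss(inner_adapt(theta + eps, x, y), xq, yq)
         - query_loss(inner_adapt(theta - eps, x, y), xq, yq)) / (2 * eps)

# The two differ exactly by the Jacobian of the inner step,
# d theta'/d theta = 1 - lr * 2 * x**2, which first-order MAML drops.
```

The approximation amounts to treating that Jacobian as the identity, which avoids second-order derivatives during meta-training.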

If you have any further questions, please do not hesitate to contact me.

yaoyao-liu avatar Jul 02 '22 22:07 yaoyao-liu