
bug in 2nd order?

Open Interesting6 opened this issue 6 years ago • 4 comments

I see that your code just uses self.net(x_spt[i], fast_weights, bn_training=True)

However, the torch.autograd.grad() method has the following parameter:

create_graph (bool, optional) – If True, graph of the derivative will be constructed, allowing to compute higher order derivative products. Default: False.

Does that mean your code only calculates the 1st-order derivative?
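To make the question concrete, here is a rough sketch of one inner-loop step (not the actual code from this repo; `net`, `loss_fn`, and the `second_order` flag are placeholders, with the functional forward call taken from the line quoted above):

```python
import torch

# Rough sketch of a single MAML inner-loop step, NOT the repo's actual code.
# `net` is assumed to support the functional forward pass quoted above,
# i.e. net(x, weights, bn_training=True); `second_order` is a hypothetical
# flag that simply forwards to create_graph.
def inner_step(net, loss_fn, x_spt, y_spt, params, lr_inner, second_order):
    logits = net(x_spt, params, bn_training=True)
    loss = loss_fn(logits, y_spt)
    # create_graph=True keeps the inner gradients in the autograd graph, so the
    # meta-update can backprop through them (full 2nd-order MAML).
    # create_graph=False (the default) detaches them, which only gives the
    # 1st-order approximation (FOMAML).
    grads = torch.autograd.grad(loss, params, create_graph=second_order)
    fast_weights = [p - lr_inner * g for p, g in zip(params, grads)]
    return fast_weights
```

So the question is whether the meta-update here corresponds to the first case or the second.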

Thank you!

Interesting6 avatar Jun 13 '19 07:06 Interesting6

self.net is not using torch.autograd.grad()

jayzhan211 avatar Jun 23 '19 03:06 jayzhan211

Same confusion here.

jingjingjing-666 avatar Jul 24 '19 14:07 jingjingjing-666

I think it is actually a 1st-order approximation.
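One quick way to see the difference (a toy sketch, not the repo's code): gradients returned without create_graph carry no grad_fn, so no higher-order terms can flow through them.

```python
import torch

w = torch.nn.Parameter(torch.randn(3))

# Default create_graph=False -> the returned gradient is detached from the graph
g1, = torch.autograd.grad((w ** 2).sum(), w)
print(g1.requires_grad, g1.grad_fn)   # False None -> 1st-order only

# create_graph=True -> the gradient itself is part of the autograd graph
g2, = torch.autograd.grad((w ** 2).sum(), w, create_graph=True)
print(g2.requires_grad, g2.grad_fn)   # True <...Backward0 ...> -> 2nd-order possible
```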

Vampire-Vx avatar Sep 19 '19 06:09 Vampire-Vx

+1

iamxiaoyubei avatar Mar 10 '20 01:03 iamxiaoyubei