MAML-Pytorch
bug in 2nd order?
I see that in your code you are just using
self.net(x_spt[i], fast_weights, bn_training=True)
however, the torch.autograd.grad() method has the following parameter:
create_graph (bool, optional) – If True, graph of the derivative will be constructed, allowing to compute higher order derivative products. Default: False.
Does that mean your code only calculates the 1st-order derivative?
Thank you!
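For context, here is a minimal sketch (not the repository's exact code) of a MAML inner-loop step showing where create_graph actually matters; the names inner_step, loss_fn, inner_lr, and second_order are illustrative assumptions:

```python
import torch

def inner_step(net, loss_fn, x_spt, y_spt, inner_lr, second_order=True):
    """One illustrative inner-loop adaptation step."""
    params = list(net.parameters())
    loss = loss_fn(net(x_spt), y_spt)
    # create_graph=True keeps the graph of this gradient computation, so the
    # outer (meta) loss can differentiate through the update -> 2nd-order MAML.
    # create_graph=False treats these gradients as constants -> first-order
    # approximation (FOMAML).
    grads = torch.autograd.grad(loss, params, create_graph=second_order)
    fast_weights = [p - inner_lr * g for p, g in zip(params, grads)]
    return fast_weights
```

In this sketch, a later forward pass such as self.net(x_spt[i], fast_weights, bn_training=True) would only reuse the fast weights; whether the meta-gradient is 2nd-order depends on the create_graph flag passed to torch.autograd.grad(), not on the forward call.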
self.net is not calling torch.autograd.grad(); that line is just the forward pass with the fast weights.
same confusion
I think it is actually a 1st-order approximation.
+1