
Warning with clone

wenzhoulyu opened this issue 3 years ago · 1 comment

```
E:\Anaconda\envs\RL\lib\site-packages\torch\nn\modules\module.py:385: UserWarning: The .grad attribute of a Tensor that is not a leaf Tensor is being accessed. Its .grad attribute won't be populated during autograd.backward(). If you indeed want the gradient for a non-leaf Tensor, use .retain_grad() on the non-leaf Tensor. If you access the non-leaf Tensor by mistake, make sure you access the leaf Tensor instead. See github.com/pytorch/pytorch/pull/30531 for more informations.
  if param.grad is not None:
```

I used the learn2learn library to implement a MAML PPO algorithm. The warning above appeared after I changed my network from

```python
self.net = nn.Sequential(
    nn.Linear(self.input_dim, mid_dim), nn.ReLU(),
    nn.Linear(mid_dim, mid_dim), nn.ReLU(),
    nn.Linear(mid_dim, mid_dim), nn.ReLU(),
    nn.Linear(mid_dim, action_dim),
)
```

to

```python
self.gru = nn.GRU(self.input_dim, mid_dim, num_layers=2)
self.net = nn.Sequential(
    nn.Linear(self.mid_dim, mid_dim), nn.ReLU(),
    nn.Linear(mid_dim, mid_dim), nn.ReLU(),
    # nn.Linear(mid_dim, mid_dim), nn.ReLU(),
    nn.Linear(mid_dim, action_dim),
)
```

Another problem: if I set adapt_step too large, my meta_policy outputs NaN after the meta update. I would appreciate any help with these problems.
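For reference, here is a minimal sketch (not the reporter's actual code) of the kind of setup described above: a GRU-based policy wrapped in learn2learn's `MAML`, cloned and adapted once. The class name `PolicyNet`, the dimensions, and the dummy loss are placeholders standing in for the PPO surrogate objective.

```python
import torch
import torch.nn as nn
import learn2learn as l2l


class PolicyNet(nn.Module):
    """GRU feature extractor followed by an MLP head, as in the issue."""

    def __init__(self, input_dim, mid_dim, action_dim):
        super().__init__()
        self.gru = nn.GRU(input_dim, mid_dim, num_layers=2)
        self.net = nn.Sequential(
            nn.Linear(mid_dim, mid_dim), nn.ReLU(),
            nn.Linear(mid_dim, mid_dim), nn.ReLU(),
            nn.Linear(mid_dim, action_dim),
        )

    def forward(self, x):
        # x: (seq_len, batch, input_dim)
        out, _ = self.gru(x)
        return self.net(out)


policy = PolicyNet(input_dim=8, mid_dim=64, action_dim=4)
maml = l2l.algorithms.MAML(policy, lr=0.1, first_order=False)

# Inner-loop adaptation: clone() creates a differentiable copy whose
# parameters are non-leaf tensors, and accessing .grad on such tensors
# (which PyTorch and clone utilities may do when copying modules) can
# emit the UserWarning quoted above.
learner = maml.clone()
x = torch.randn(5, 2, 8)
loss = learner(x).pow(2).mean()  # placeholder for the PPO loss
learner.adapt(loss)
```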

wenzhoulyu · Jul 22 '21

Which version of learn2learn do you have installed? We should support RNNs (I had tested with LSTMs). If you could provide a colab reproducing the issue, I'll try to dig into it.
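For anyone following along, the installed version can be checked with the standard library (a generic approach, not a learn2learn-specific API):

```python
# Print the installed learn2learn version.
from importlib.metadata import version

print(version("learn2learn"))
```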

seba-1511 · Aug 06 '21

Closing since inactive.

seba-1511 · May 29 '23