
Get .item() error without calling .item()

Open · LiXinrong1012 opened this issue 3 years ago · 1 comment

Hello, I'm new to this package and I want to compute a batched Jacobian of a self-implemented vector function, but I get the following error when I do so:

RuntimeError: vmap: It looks like you're calling .item() on a Tensor. We don't support vmap over calling .item() on a Tensor, please try to rewrite what you're doing with other operations. If error is occurring somewhere inside PyTorch internals, please file a bug report.

Here is my code. I don't understand where the .item() call comes from. Is the slicing operation q_current[0:3] the problem? How can I fix this?

import torch
from functorch import jacrev, vmap

# batch * len
q_current = torch.randn((4, 4 * 3 - 1), requires_grad=True)


def geoCompute(q_current):
    # take the first three components of each sample
    k1 = q_current[0:3]
    return k1


jacobian = vmap(jacrev(geoCompute))(q_current)
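(Editorial note, not from the original thread: this error usually comes from an *implicit* .item() call, e.g. a float() or int() conversion, or Python control flow on a tensor value, somewhere inside the function being vmapped. A minimal sketch of the failure mode and the usual fix, written against the torch.func namespace that superseded functorch in PyTorch 2.x:)

```python
import torch
from torch.func import vmap

x = torch.randn(4, 3)


def bad(row):
    # float() on a 0-dim tensor calls .item() under the hood,
    # which vmap cannot handle and reports as a .item() error.
    return float(row[0]) * row


try:
    vmap(bad)(x)
except RuntimeError as e:
    print("vmap error:", e)


def good(row):
    # keep everything as tensor operations; no Python scalar conversion
    return row[0] * row


out = vmap(good)(x)
print(out.shape)  # torch.Size([4, 3])
```

The slicing in the original script is fine; only Python-scalar conversions (or data-dependent control flow) trigger this class of error under vmap.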

LiXinrong1012 avatar Sep 20 '22 07:09 LiXinrong1012

What version of functorch are you using? Your minimal reproducible script works for me; it returns:

tensor([[[1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
         [0., 1., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
         [0., 0., 1., 0., 0., 0., 0., 0., 0., 0., 0.]],

        [[1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
         [0., 1., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
         [0., 0., 1., 0., 0., 0., 0., 0., 0., 0., 0.]],

        [[1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
         [0., 1., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
         [0., 0., 1., 0., 0., 0., 0., 0., 0., 0., 0.]],

        [[1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
         [0., 1., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
         [0., 0., 1., 0., 0., 0., 0., 0., 0., 0., 0.]]])
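(Editorial note, not part of the original reply: since the script works on a current install, checking the installed version is the natural next step. A sketch of the same computation using torch.func, the home of these APIs in PyTorch 2.x, which reproduces the output above:)

```python
import torch

# first thing to check when behavior differs between machines
print(torch.__version__)

from torch.func import jacrev, vmap

# batch of 4 samples, each of length 11
q_current = torch.randn(4, 4 * 3 - 1)


def geoCompute(q):
    # first three components of one sample
    return q[0:3]


jacobian = vmap(jacrev(geoCompute))(q_current)
print(jacobian.shape)  # torch.Size([4, 3, 11])
```

Each per-sample Jacobian is the 3×11 matrix with ones on the leading diagonal, matching the tensor printed above.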

AlphaBetaGamma96 avatar Sep 20 '22 12:09 AlphaBetaGamma96