Bug when using `requires_grad = TRUE` with `torch_scalar_tensor`
library(torch)
x <- torch_scalar_tensor(1, requires_grad = TRUE)
y <- torch_scalar_tensor(2, requires_grad = TRUE)
out <- x + y
out$backward()
print(x$grad)
#> torch_tensor
#> [ Tensor (undefined) ]
print(y$grad)
#> torch_tensor
#> [ Tensor (undefined) ]
Created on 2025-10-14 with reprex v2.1.1
Hello @sebffischer,
The `print(x$grad)` output is now accompanied by an explicit warning message (at least on my machine):
> library(torch)
> x <- torch_scalar_tensor(1, requires_grad = TRUE)
> y <- torch_scalar_tensor(2, requires_grad = TRUE)
>
> out <- x + y
> out$backward()
>
> print(x$grad)
[W1106 08:55:23.585101252 TensorBody.h:489] Warning: The .grad attribute of a Tensor that is not a leaf Tensor is being accessed. Its .grad attribute won't be populated during autograd.backward(). If you indeed want the .grad field to be populated for a non-leaf Tensor, use .retain_grad() on the non-leaf Tensor. If you access the non-leaf Tensor by mistake, make sure you access the leaf Tensor instead. See github.com/pytorch/pytorch/pull/30531 for more informations. (function grad)
torch_tensor
[ Tensor (undefined) ]
>
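As the warning suggests, calling `$retain_grad()` on the tensors before `backward()` should make their `$grad` fields populate even though they are no longer leaf tensors. A minimal sketch of that workaround (assuming `$retain_grad()` behaves like its PyTorch counterpart):

library(torch)

x <- torch_scalar_tensor(1, requires_grad = TRUE)
y <- torch_scalar_tensor(2, requires_grad = TRUE)

# Ask autograd to keep gradients for these non-leaf tensors,
# as recommended by the warning above.
x$retain_grad()
y$retain_grad()

out <- x + y
out$backward()

print(x$grad)  # expected: a defined scalar tensor holding 1
print(y$grad)  # expected: a defined scalar tensor holding 1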
Hope it helps,
@cregouby Yeah, but it's still a bug (and, I think, an easy one to fix). The problem is that `torch_scalar_tensor()` calls `$squeeze()` internally, so the result is no longer a leaf tensor.
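For illustration, a sketch of a user-side workaround under that diagnosis: create the tensor without gradient tracking and only then enable it via `$requires_grad_()`. The internal `$squeeze()` then runs before any autograd history exists, so the result stays a leaf (this assumes `$requires_grad_()` mirrors its PyTorch counterpart):

library(torch)

# Create without gradient tracking, then enable it in place:
# the internal $squeeze() has already run on a tensor with no
# autograd history, so x and y remain leaf tensors.
x <- torch_scalar_tensor(1)
y <- torch_scalar_tensor(2)
x$requires_grad_(TRUE)
y$requires_grad_(TRUE)

out <- x + y
out$backward()

print(x$grad)  # expected: a defined scalar tensor holding 1
print(y$grad)  # expected: a defined scalar tensor holding 1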