
`requires_grad` *should* be carried through expressions

Open · joewallwork opened this issue 7 months ago · 1 comment

Closes #370.

It turns out I was wrong in #364. Some simple Python experimentation shows that we should be carrying requires_grad through expressions:

```python
>>> import torch
>>> a = torch.Tensor([1])
>>> a.requires_grad = True
>>> a
tensor([1.], requires_grad=True)
>>> b = torch.Tensor([2])
>>> c = a + b
>>> c
tensor([3.], grad_fn=<AddBackward0>)
>>> c.requires_grad
True
```
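
Continuing that session (an extra check beyond the original experiment), gradients really do flow back to `a` through the derived tensor `c`:

```python
>>> c.backward()
>>> a.grad
tensor([1.])
```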

This PR reverts the changes from #364 and drops the related check that was causing issues.

Further, when a tensor is reassigned to the result of an expression whose operands do not require gradients, we should actually be overwriting its requires_grad with False, too:

```python
>>> import torch
>>> a = torch.Tensor([1])
>>> a.requires_grad = True
>>> a
tensor([1.], requires_grad=True)
>>> b = torch.Tensor([1])
>>> b
tensor([1.])
>>> a = b + b
>>> a
tensor([2.])
```
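
To summarise the rule this PR restores, here is a small sketch of my own (not FTorch code; `expected_requires_grad` is a hypothetical helper): the result of an expression requires gradients exactly when at least one operand does, which is what libtorch itself reports:

```python
import torch

def expected_requires_grad(*inputs):
    # Hypothetical helper: an expression's result requires grad
    # iff at least one of its operands does.
    return any(t.requires_grad for t in inputs)

a = torch.tensor([1.0], requires_grad=True)
b = torch.tensor([2.0])

# Check the rule against PyTorch's own behaviour for a few operand pairs.
for lhs, rhs in [(a, b), (b, b), (a, a)]:
    out = lhs + rhs
    assert out.requires_grad == expected_requires_grad(lhs, rhs)
```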

joewallwork · May 02 '25 11:05