
autograd.backward and autograd.grad produce wrong results when used on the same tensor

Open wyg1997 opened this issue 2 years ago • 0 comments

Summary

Because of how autograd.grad is implemented, it borrows the tensor.retain_grad mechanism and stores a temporary gradient in tensor.grad. If tensor.grad already holds a value from an earlier backward() call, the temporary gradient is accumulated onto it, producing a wrong result.

Code to reproduce bug

import oneflow as flow
#  import torch as flow

a = flow.ones(2, 3).requires_grad_()
b = a ** 2
b.backward(flow.ones_like(b))
print(a.grad)

# a.grad = None  # the following computation is only correct if tensor.grad is reset like this
b = a ** 2
a_grad = flow.autograd.grad(b, a, flow.ones_like(b), create_graph=True)[0]
print(a_grad)  # OneFlow returns a tensor filled with 4.0, but it should be 2.0
print(a_grad.requires_grad)  # OneFlow returns False, but it should be True, because a_grad = 2.0 * a * b_grad
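For comparison, here is the expected behavior when the same sequence is run against PyTorch (the reference the repro script points at with `import torch as flow`). This is a minimal sketch assuming PyTorch is installed; it shows that `torch.autograd.grad` is unaffected by a stale `a.grad` left over from an earlier `backward()` call:

```python
# Reference run in PyTorch: autograd.grad should NOT be polluted by the
# gradient that an earlier backward() accumulated into a.grad.
import torch

a = torch.ones(2, 3).requires_grad_()
b = a ** 2
b.backward(torch.ones_like(b))  # accumulates 2.0 into a.grad

# Note: a.grad is intentionally NOT reset here.
b = a ** 2
a_grad = torch.autograd.grad(b, a, torch.ones_like(b), create_graph=True)[0]

print(a_grad)                # all entries 2.0: d(a**2)/da = 2*a, independent of a.grad
print(a_grad.requires_grad)  # True, because create_graph=True keeps the graph alive
```

In PyTorch this works because `autograd.grad` collects gradients for the requested inputs without writing through `tensor.grad`, so the stale accumulated value never enters the computation.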

wyg1997 — Nov 08 '22 03:11