
Cannot call backward() on loss tensor more than once even after setting retain_graph to True


Summary

backward() cannot be called more than once on any tensor that depends on the output of render_torch(), even with retain_graph=True.

System configuration

  • Platform: Windows 10 / Ubuntu 20.04 (likely macOS as well)
  • Compiler: VS 2019 16.5.2 / clang-9
  • Python version: 3.7.6
  • Mitsuba 2 version: 2.1.0
  • Compiled variants:
    • gpu_autodiff_rgb

Description

Any loss tensor computed from the output of the render_torch() function allows backward() to be called on it only once. Calling backward(retain_graph=True) does not help. The error produced on the second call to backward() is as follows:

render_torch(): critical exception during backward pass: 'RenderBackward' object has no attribute 'output'
Traceback (most recent call last):
  File "c:\Users\Rabbi\Documents\DynamicStructuredLight\test.py", line 63, in <module>
    ob_val.backward(retain_graph=True)
  File "C:\Users\Rabbi\Anaconda3\lib\site-packages\torch\tensor.py", line 166, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
  File "C:\Users\Rabbi\Anaconda3\lib\site-packages\torch\autograd\__init__.py", line 99, in backward
    allow_unreachable=True)  # allow_unreachable flag
  File "C:\Users\Rabbi\Anaconda3\lib\site-packages\torch\autograd\function.py", line 77, in apply
    return self._forward_cls.backward(self, *args)
  File "C:\Users\Rabbi\Documents\mitsuba2\build\dist\python\mitsuba\python\autodiff.py", line 471, in backward
    raise e
  File "C:\Users\Rabbi\Documents\mitsuba2\build\dist\python\mitsuba\python\autodiff.py", line 458, in backward
    ek.set_gradient(ctx.output, ek.detach(Float(grad_output)))
AttributeError: 'RenderBackward' object has no attribute 'output'

I think this is because ctx.output is deleted after the first backward pass in autodiff.py?
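
For illustration, here is a minimal pure-PyTorch sketch of that suspected mechanism (not the actual mitsuba code; the class and attribute names are stand-ins): a custom autograd Function whose backward() reads and then deletes a context attribute fails with the same kind of AttributeError on a second call, even with retain_graph=True, because the graph node survives but the data it needs does not.

import torch

class Doubler(torch.autograd.Function):
    # Stand-in for render_torch's RenderBackward: backward() consumes
    # and deletes the attribute stashed on the context during forward().
    @staticmethod
    def forward(ctx, x):
        ctx.output = (x * 2).detach()   # stand-in for the Enoki output handle
        return x * 2

    @staticmethod
    def backward(ctx, grad_output):
        _ = ctx.output       # raises AttributeError on the second call
        del ctx.output       # cleanup that outlives retain_graph=True
        return grad_output * 2

x = torch.ones(3, requires_grad=True)
loss = Doubler.apply(x).sum()
loss.backward(retain_graph=True)   # first call succeeds
loss.backward(retain_graph=True)   # AttributeError: 'DoublerBackward'
                                   # object has no attribute 'output'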

Steps to reproduce

  1. Modify the PyTorch integration example from the differentiable rendering documentation by calling ob_val.backward() twice, passing retain_graph=True to both calls (a condensed sketch follows this list).
  2. Run the modified example.
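
A condensed sketch of the modified example, following the structure of the docs' PyTorch integration example; the scene path and parameter key are placeholders:

import torch
import mitsuba
mitsuba.set_variant('gpu_autodiff_rgb')

from mitsuba.core.xml import load_file
from mitsuba.python.util import traverse
from mitsuba.python.autodiff import render_torch

scene = load_file('scene.xml')            # placeholder scene path
params = traverse(scene)
params.keep(['red.reflectance.value'])    # placeholder parameter key
params_torch = params.torch()

image_ref = render_torch(scene, spp=8)    # reference image

objective = torch.nn.MSELoss()
image = render_torch(scene, params=params, unbiased=True,
                     spp=1, **params_torch)
ob_val = objective(image, image_ref)

ob_val.backward(retain_graph=True)        # first call succeeds
ob_val.backward(retain_graph=True)        # second call raises the AttributeError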

mehrab2603 — Jul 14, 2020