MateuszGuzek
When you now run the example, you encounter this error:

```
  File "./rl/examples/dreamer/dreamer.py", line 321, in main
    scaler_actor.scale(actor_loss_td["loss_actor"]).backward()
  File "~/anaconda3/envs/rl_dreamer_3_10/lib/python3.10/site-packages/torch/_tensor.py", line 487, in backward
    torch.autograd.backward(
  File "~/anaconda3/envs/rl_dreamer_3_10/lib/python3.10/site-packages/torch/autograd/__init__.py", line 200, in...
```
> I think the error you're getting is due to the fact that you're computing all the losses at once but then backpropagating them one at a time. In...
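
Here is a minimal, self-contained sketch of the pattern described in that reply, not the actual dreamer.py code (the model, tensors, and losses below are purely illustrative). It shows why building several losses from one shared forward pass and then calling `.backward()` on them one at a time can raise a RuntimeError, and one way to avoid it by recomputing the forward pass per loss:

```python
import torch

# Illustrative stand-ins for the world model / actor computations.
model = torch.nn.Linear(4, 4)
x = torch.randn(8, 4)

# Both losses are built from the same forward pass, so they share a graph.
features = model(x)
loss_model = features.pow(2).mean()
loss_actor = features.abs().mean()

loss_model.backward()          # the shared graph's buffers are freed here
try:
    loss_actor.backward()      # second backward through the freed graph fails
except RuntimeError as err:
    print(err)                 # typically "Trying to backward through the graph a second time ..."

# One fix: give each loss its own graph (recompute the forward pass per loss),
# i.e. compute and backpropagate each loss in turn rather than all at once.
model.zero_grad()
loss_model = model(x).pow(2).mean()
loss_model.backward()
loss_actor = model(x).abs().mean()
loss_actor.backward()
```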