PCC-pytorch
Curvature Loss is Dimensionally Incorrect
Hey, I first wanted to say that this is great work. I also wanted to point out a problem in the non-amortized curvature loss. If you look at the shapes:
`grad_z` is `[batch_size x latent_dim]` and `grad_u` is `[batch_size x action_dim]`,
but I think, per the paper, `grad_u` should be `[batch_size x latent_dim]`.
So I think the way the curvature loss is currently formulated is dimensionally nonsensical: whenever action_dim ≠ latent_dim and action_dim ≠ 1, combining the two gradients produces a dimension mismatch. Let me know if you have any questions.
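A minimal sketch of the mismatch (using NumPy for illustration; the sizes are arbitrary placeholders, not values from the repo):

```python
import numpy as np

batch_size, latent_dim, action_dim = 32, 8, 2  # illustrative sizes only

# Shapes as reported above (values are placeholders, not real gradients).
grad_z = np.zeros((batch_size, latent_dim))   # [batch_size x latent_dim]
grad_u = np.zeros((batch_size, action_dim))   # [batch_size x action_dim]

try:
    # Any elementwise combination of the two inside the curvature penalty
    # needs matching (or broadcastable) trailing dimensions.
    _ = grad_z + grad_u
    mismatch = False
except ValueError:
    mismatch = True

print("dimension mismatch:", mismatch)
```

With `latent_dim = 8` and `action_dim = 2`, the elementwise addition raises a `ValueError`; it only goes through when `action_dim == latent_dim` or `action_dim == 1` (via broadcasting), which matches the condition described above.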