res-loglikelihood-regression
bar_mu computation differs from Eq. (5) in the paper
Hi there,
It seems the bar_mu computation differs from the paper by a sign: it should be multiplied by -1. Below Eq. (5), the paper defines bar_mu = (gt - mu_pred) / sigma, but the code computes (mu_pred - gt) / sigma, as shown here:
https://github.com/Jeff-sjtu/res-loglikelihood-regression/blob/203dc3195ee5a11ed6f47c066ffdb83247511359/rlepose/models/regression_nf.py#L134
This does not affect the computation of log_Q, which basically uses the absolute value of this term. But what about the flow model? I'm not sure whether this sign flip makes any difference to the learning of the RealNVP flow, or did I miss something here?
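To make the two conventions concrete, here is a minimal sketch (variable names pred_jts, gt_uv, sigma follow the repo; the values, the Laplace-style term, and the flow stand-in are hypothetical, not the repo's exact code):

```python
import torch
from torch.distributions import Normal

# Toy stand-in for the RealNVP flow: anything with a log_prob over the residual.
# A mean-shifted Normal is used so the density is NOT symmetric about 0 and the
# sign flip is visible. (Hypothetical values throughout; not the repo's code.)
flow = Normal(loc=0.3, scale=1.0)

pred_jts = torch.tensor([0.52, 0.31])   # predicted joint coords (hypothetical)
gt_uv    = torch.tensor([0.50, 0.35])   # ground-truth coords (hypothetical)
sigma    = torch.tensor([0.05, 0.05])   # predicted scale (hypothetical)

# Residual as defined below Eq. (5) in the paper
bar_mu_paper = (gt_uv - pred_jts) / sigma
# Residual as computed at regression_nf.py#L134: the sign is flipped
bar_mu_code = (pred_jts - gt_uv) / sigma       # == -bar_mu_paper

# Schematic Laplace-style log_Q term (constants omitted): it only sees |bar_mu|,
# so the sign flip changes nothing here.
log_q_paper = torch.log(sigma) + torch.abs(bar_mu_paper)
log_q_code  = torch.log(sigma) + torch.abs(bar_mu_code)
assert torch.allclose(log_q_paper, log_q_code)

# The flow term does see the sign: log_phi differs unless the learned density
# happens to be symmetric about zero.
print(flow.log_prob(bar_mu_paper))
print(flow.log_prob(bar_mu_code))
```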
Thanks.
I have the same question. Have you found the answer?
Not really. I also posted the same question in the mmpose repo: https://github.com/open-mmlab/mmpose/issues/1825#issuecomment-1321510025
The authors mentioned a pull request when integrating the RLE loss. However, that pull request discusses a different bug, on sigma, not on bar_mu. The sigma bug is fixed in mmpose; this repo keeps the old version.
For bar_mu, I ran two experiments using mmpose and got similar results. It seems that the sign of bar_mu has no impact on the final result.
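That matches what one would expect on paper: negating the residual only asks the flow to learn the mirrored density, which it can represent equally well. A tiny sketch of the argument (toy density and hypothetical names, not mmpose code):

```python
import torch
from torch.distributions import Normal

# If phi(x) is the density the flow would learn for bar_mu, then -bar_mu is
# distributed as phi(-x). The flip x -> -x is invertible with |det J| = 1, so
# the mirrored density is representable at no cost: both sign conventions have
# the same achievable log-likelihood, consistent with the two runs above.

def log_prob_mirrored(log_prob_fn, x):
    # Density of the negated residual; log|det(-I)| = 0, so nothing is added.
    return log_prob_fn(-x)

toy_phi = Normal(loc=0.3, scale=1.0)    # hypothetical asymmetric residual density
bar_mu = toy_phi.sample((1000,))        # pretend these are residuals under one convention

ll_one_sign   = toy_phi.log_prob(bar_mu).mean()                       # fit to bar_mu
ll_other_sign = log_prob_mirrored(toy_phi.log_prob, -bar_mu).mean()   # mirrored fit to -bar_mu
print(ll_one_sign, ll_other_sign)       # identical values
```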