SRFlow
How does the Normalizing Flow loss in SRFlow behave?
I followed this repo, the paper, and the Glow paper. My loss looks like this:
logp = GaussianDiag.logp(None, None, z)   # log-density of the latents under the prior
obj = logdets + logp                      # total log-likelihood in nats
loss = -obj / (tf.math.log(2.) * tf.cast(pixels, tf.float32))  # NLL in bits per pixel
Two terms are added together: logp, which is always negative, and logdets, which starts out negative during training. Because of the minus sign in -obj, the loss starts out positive, but after training for around 100k steps both logp and logdets increase. The logp term becomes positive, and hence the final loss turns negative, around -3.xxx. I just wanted to know whether this is expected behaviour?
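
For reference, here is a minimal self-contained sketch of the same bits-per-pixel objective, assuming a standard-Gaussian prior; the tensors `z`, `logdets`, and `pixels` are placeholders, not the repo's variables:

```python
import math
import tensorflow as tf

def nll_bits_per_pixel(z, logdets, pixels):
    # Standard-Gaussian log-density of the latents, summed over all
    # non-batch dimensions (one value per sample).
    logp = tf.reduce_sum(
        -0.5 * (math.log(2.0 * math.pi) + tf.square(z)),
        axis=list(range(1, len(z.shape))),
    )
    obj = logdets + logp  # total log-likelihood in nats, per sample
    # Dividing by log(2) * pixels converts nats to bits per pixel.
    # Because logp is a log *density*, it can exceed 0 once the model
    # concentrates mass (e.g. a 1-D Gaussian with sigma = 0.1 peaks at
    # ~3.99, log ~ 1.38), so this loss can legitimately go negative.
    return -obj / (math.log(2.0) * tf.cast(pixels, tf.float32))
```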
Yes, that is expected: the final loss of SRFlow is negative and steadily decreasing.

We did not track those two components separately. Please try to run our code and log the loss (NLL is defined here).
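
If you do want to watch the two components separately, here is a hedged sketch using TF2's summary API, assuming the `logp`, `logdets`, and `loss` tensors from the snippet above (this is not how the repo itself logs):

```python
import tensorflow as tf

writer = tf.summary.create_file_writer("logs/srflow")  # hypothetical log dir
with writer.as_default():
    # logp, logdets, loss are assumed per-sample tensors from the loss above;
    # global_step is a placeholder for your training-step counter.
    tf.summary.scalar("flow/logp", tf.reduce_mean(logp), step=global_step)
    tf.summary.scalar("flow/logdet", tf.reduce_mean(logdets), step=global_step)
    tf.summary.scalar("flow/nll_bpp", tf.reduce_mean(loss), step=global_step)
```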
Did that help?
Hi @andreas128,
Thanks, that is what I was looking for, and it clears up my doubt about the logdets term.