
How does the Normalizing Flow loss in SRFlow behave?

Open avinash31d opened this issue 3 years ago • 2 comments

I followed this repo, paper and glow paper. My loss looks like this

    logp = GaussianDiag.logp(None, None, z)
    obj = logdets + logp
    loss = -obj / (tf.math.log(2.) * tf.cast(pixels, tf.float32))

Two terms are added together: logp, which is always negative early on, and logdets, which also starts out negative during training. Because of the minus sign in -obj, the loss is initially positive. After around 100k steps, however, both logp and logdets increase; logp becomes positive, and the final loss turns negative, around -3.xxx. I just wanted to know whether this is expected behaviour?
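A negative loss is not a contradiction for a continuous model. The snippet above computes bits per dimension, -(logp + logdets) / (ln 2 · pixels), and a continuous *density* can exceed 1, so its log-density can be positive and the resulting loss negative. Here is a minimal standalone sketch (plain Python with a 1-D Gaussian; `bits_per_dim` and `gaussian_log_density` are illustrative helpers, not functions from the repo) showing both signs:

```python
import math

def bits_per_dim(log_density, pixels=1):
    # Same normalization as the training snippet: -log p / (ln 2 * num pixels)
    return -log_density / (math.log(2.0) * pixels)

def gaussian_log_density(x, mu=0.0, sigma=1.0):
    # Log-density of a 1-D Gaussian evaluated at x
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

# A broad Gaussian has density < 1 at its mode -> log-density < 0 -> positive loss
wide = gaussian_log_density(0.0, sigma=1.0)
# A sharply peaked Gaussian has density > 1 at its mode -> log-density > 0 -> negative loss
narrow = gaussian_log_density(0.0, sigma=0.1)

print(bits_per_dim(wide))    # ≈ 1.33 (positive)
print(bits_per_dim(narrow))  # ≈ -2.00 (negative)
```

In a flow, the same effect shows up through logdets: as the model concentrates probability mass on the data, logp + logdets grows past zero and the bits-per-dim loss goes negative.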

avinash31d avatar Jan 22 '21 16:01 avinash31d

The final loss of SRFlow is negative and steadily decreasing:

[Plot: PyTorch normalizing-flow loss curve, negative and steadily decreasing over training]

We did not track those two components separately. Please try to run our code and log the loss (NLL is defined here).

Did that help?

andreas128 avatar Jan 23 '21 14:01 andreas128

Hi @andreas128,

Thanks, that is exactly what I was looking for, and it resolves my doubt about the logdets term.

avinash31d avatar Jan 23 '21 14:01 avinash31d