glow-pytorch
Positive logdet
Hi, in one flow step there are actnorm, permute, and coupling. The sigmoid(scale + 2.) ensures the coupling layer contributes a non-positive logdet, but there is no similar constraint on actnorm or permute, so I encounter a positive logdet at the final output when using the network configuration K=20, L=3. I would appreciate any ideas on how to handle this.
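For context, the reason the coupling contribution is non-positive in the affine case is that scale = sigmoid(raw + 2) lies in (0, 1), so log(scale) is negative everywhere. A minimal check (illustrative only, not the repository's code):

    import torch

    # sigmoid(x + 2) is always in (0, 1), so its log is always negative,
    # which makes the affine coupling's logdet contribution non-positive.
    raw = torch.randn(3, 8, 8)
    scale = torch.sigmoid(raw + 2.)
    coupling_logdet = torch.sum(torch.log(scale))
    print(scale.min().item() > 0, scale.max().item() < 1, coupling_logdet.item() < 0)
    # -> True True True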
Hello, I am running into the same problem. When I use the original hyperparameters to run the model on multipie, I get a negative nll, but I expected it to be positive. Did you solve this problem? I would appreciate any help!
Could you please explain why you would want a positive logdet?
I don't see why you would want this; the nll can very well be negative in the case where the random variables are continuous.
Maybe I am missing something...
Thanks in advance
Hi, I think it is the positive logdet caused by the actnorm and permute operations that gives the negative nll at the end. By default I had the mindset that nll should always be above 0 because of its -log(p) definition. Then I realized that p(x) is the value of a pdf and can therefore be any positive value, so it is fine for the nll to be negative.
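To make that concrete, here is a minimal sketch (not from this repository, just torch.distributions) of a continuous density whose value exceeds 1, which makes the nll negative:

    import torch
    from torch.distributions import Normal

    # A narrow Gaussian has pdf values greater than 1 near its mean,
    # so -log p(x) is negative there.
    dist = Normal(loc=0.0, scale=0.1)
    x = torch.tensor(0.0)
    nll = -dist.log_prob(x)
    print(nll)  # approximately -1.38, i.e. a negative nll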
the logdet has no reason to be positive ... it depends on whether the function is contractive or expanding
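As a toy illustration (an assumed scaling map, not code from this repository): an elementwise scaling y = s * x over D dimensions has log|det J| = D * log(s), which is positive for an expanding map (s > 1) and negative for a contracting one (s < 1):

    import math

    D = 4                     # number of dimensions
    for s in (2.0, 0.5):      # expanding vs. contracting scaling
        logdet = D * math.log(s)
        print(s, logdet)      # 2.0 -> +2.77, 0.5 -> -2.77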
Hi, first of all, I do not want to have a positive logdet. I encountered negative nll values while training, and looking into the problem shows that the negative nll comes from a positive logdet in the normal_flow function:
    def normal_flow(self, input, logdet):
        assert input.size(1) % 2 == 0
        # 1. actnorm
        z, logdet = self.actnorm(input, logdet=logdet, reverse=False)
        # 2. permute
        z, logdet = FlowStep.FlowPermutation[self.flow_permutation](
            self, z, logdet, False)
        # 3. coupling
        z1, z2 = thops.split_feature(z, "split")
        if self.flow_coupling == "additive":
            z2 = z2 + self.f(z1)
        elif self.flow_coupling == "affine":
            h = self.f(z1)
            shift, scale = thops.split_feature(h, "cross")
            scale = torch.sigmoid(scale + 2.)
            z2 = z2 + shift
            z2 = z2 * scale
            logdet = thops.sum(torch.log(scale), dim=[1, 2, 3]) + logdet
        z = thops.cat_feature(z1, z2)
        return z, logdet
As I was saying, the three steps do not guarantee the sign of the logdet, so the logdet can be positive.
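To illustrate, here is a minimal sketch (assumed parameter values and shapes, not this repository's exact code) of the actnorm and invertible-1x1-conv contributions per the Glow formulation; both are unconstrained in sign and can outweigh the non-positive coupling term, especially over K=20 steps:

    import torch

    height, width = 4, 4
    logs = torch.tensor([0.4, 0.2])                  # actnorm log-scales, can be > 0
    weight = torch.tensor([[2.0, 0.0],
                           [0.0, 1.5]])              # 1x1 conv weight, |det W| = 3

    # actnorm: y = x * exp(logs), logdet = h * w * sum(logs)
    actnorm_term = torch.sum(logs) * height * width            # = 9.6
    # invertible 1x1 conv: logdet = h * w * log|det W|
    permute_term = torch.slogdet(weight)[1] * height * width   # = 16 * log(3) ~ 17.6

    print(actnorm_term.item(), permute_term.item())  # both positive here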