Optimus
Question about mutual information
Hello, thank you very much for making the code available. I'm confused about the mutual information math, specifically about this line:
E_{q(z|x)}log(q(z|x)) = -0.5*nz*log(2*\pi) - 0.5*(1+logvar).sum(-1)
neg_entropy += (-0.5 * nz * math.log(2 * math.pi)- 0.5 * (1 + logvar).sum(-1)).sum().item()
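For reference, this is the diagonal-Gaussian log-density I'm starting from (my own notation, with mu_i and sigma_i^2 the posterior mean and variance, so logvar_i = log sigma_i^2):

$$
\log q(z|x) = \sum_{i=1}^{nz} \left( -\tfrac{1}{2}\log(2\pi) - \tfrac{1}{2}\log\sigma_i^2 - \tfrac{(z_i - \mu_i)^2}{2\sigma_i^2} \right)
$$

Taking the expectation of this under q(z|x) term by term is where my result seems to differ from the repo's line, by the extra -0.5*nz coming from the "1" inside the sum.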
When I derive it myself, I get neg_entropy += (-0.5 * nz * math.log(2 * math.pi) - 0.5 * (logvar).sum(-1)).sum().item()
So I think I must have made a mistake somewhere, but I can't see where. Thank you very much!
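If it helps, this is the kind of Monte Carlo check I would use to compare the two closed forms against a sampled estimate of E_{q(z|x)}[log q(z|x)]. It is only a sketch: mu, logvar, batch and nz below are placeholder values, not the repo's actual tensors.

```python
# Monte Carlo sanity check for E_{q(z|x)}[log q(z|x)] with a diagonal Gaussian.
# mu, logvar, batch and nz are placeholders, not taken from the repo.
import math
import torch

torch.manual_seed(0)
batch, nz = 4, 32
mu = torch.randn(batch, nz)        # posterior means
logvar = torch.randn(batch, nz)    # posterior log-variances
std = torch.exp(0.5 * logvar)

# Sample z ~ q(z|x) and average log q(z|x) over the samples.
n_samples = 50_000
z = mu + std * torch.randn(n_samples, batch, nz)
log_q = (-0.5 * math.log(2 * math.pi) - 0.5 * logvar
         - 0.5 * (z - mu) ** 2 / std ** 2).sum(-1)
mc_estimate = log_q.mean(0)

# Closed form used in the repo (with the "+1" inside the sum).
repo_form = -0.5 * nz * math.log(2 * math.pi) - 0.5 * (1 + logvar).sum(-1)

# My version (without the "+1").
my_form = -0.5 * nz * math.log(2 * math.pi) - 0.5 * logvar.sum(-1)

print(mc_estimate)  # per-example Monte Carlo estimate of E_q[log q(z|x)]
print(repo_form)
print(my_form)
```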