glow-pytorch
Infinite values in generated images
To generate new images, I sample z from a normal distribution with zero mean and 0.6 standard deviation and feed it to the network with the reverse=True argument.
But many of the generated images contain plenty of values greater than 1, and even Inf values!
How can I handle this issue? What is the problem?
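In code, the sampling step looks roughly like this (a paraphrased sketch, not the exact script; the model construction, the latent shape z_shape, and the forward signature are placeholders that depend on the config):

```python
import torch

# Placeholder: construct/load the trained Glow model from this repo here.
# glow = ...
glow.eval()

with torch.no_grad():
    # Sample z ~ N(0, 0.6^2); z_shape stands in for the latent shape
    # the model expects, e.g. (batch, C, H, W) at the last flow level.
    z = torch.randn(z_shape) * 0.6
    images = glow(z=z, reverse=True)  # reverse pass: latent -> image
```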
Thanks.
Sorry, I'm not sure what caused the problem. What do the generated images with abnormal values look like?
Actually, I found the part where the value explosion occurs.
It happens in module.py at line 53, where the input is scaled by torch.exp(logs).
The Inf values often appear around layer 80 during the forward pass (with reverse=True).
The generated image with negative Inf values then looks something like the one attached (clamped to [0, 1]).
As a result, the gradient becomes Inf in the backward pass too, so training becomes impossible.
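To see why that line can blow up, here is a small standalone illustration (not the repo's code): once a log-scale entry drifts past roughly 88, torch.exp overflows float32 to Inf, and anything multiplied by it becomes Inf as well.

```python
import torch

logs = torch.tensor([1.0, 50.0, 100.0])  # log-scales; 100 exceeds the float32 limit
scale = torch.exp(logs)                  # exp(100) overflows float32 -> inf
print(scale)                             # tensor([2.7183e+00, 5.1847e+21, inf])
print(torch.ones(3) * scale)             # the inf propagates into the output

# A common mitigation (an assumption, not this repo's code) is to bound logs:
safe_scale = torch.exp(logs.clamp(max=20.0))
```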
I am getting similarly high values. @isharifi, have you been able to resolve the issue?
Unfortunately, the problem has not been resolved. @chaiyujin, do you have any idea?
@isharifi Sorry for the delay; I'm busy with my own project. I think it may be caused by some invalid numerical operation.
A numerical issue of division by zero may arise in https://github.com/chaiyujin/glow-pytorch/blob/487a6b149295f4ec4b36e408f63604c593ff2031/glow/models.py#L93 when there are zero elements in the sigmoid output.
For me, the following code snippet triggers the division by zero (running unconditional generation):
```python
import torch

torch.cuda.manual_seed_all(16)
glow = glow.to('cuda')  # `glow` is an already-constructed Glow model
glow(reverse=True)
```
Or on CPU:
```python
import torch

torch.manual_seed(37)
glow = glow.to('cpu')
glow(reverse=True)
```
I couldn't reproduce it with conditional generation, though. A possible fix would be to add a small value elementwise to the scale before the division.
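A sketch of that fix, with the coupling-layer lines paraphrased rather than copied from models.py; only the added epsilon is new. Note that the forward direction and the log-determinant should use the same epsilon-adjusted scale so the transform stays exactly invertible:

```python
eps = 1e-6  # assumed small constant; keeps the division finite

# Paraphrased affine-coupling step (variable names are illustrative):
scale = torch.sigmoid(raw_scale + 2.0) + eps  # sigmoid can underflow to exactly 0 in float32
if not reverse:
    z2 = (z2 + shift) * scale
    logdet = logdet + torch.sum(torch.log(scale), dim=[1, 2, 3])
else:
    z2 = z2 / scale - shift  # without eps, scale == 0 here produces inf
```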
Thanks. I will check if it solves the problem and let you know the result.