
Infinite values in generated images

Open · isharifi opened this issue · 7 comments

To generate new images, I sample z from a normal distribution with zero mean and 0.6 standard deviation and feed it to the network with the reverse=True argument. But many of the resulting images contain plenty of values greater than 1, and even Inf values! How can I handle this issue? What is the problem?
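
Roughly, this is what my sampling code looks like (a minimal sketch; the latent shape and the z= keyword argument are illustrative assumptions, not copied from the repo):

import torch

# "glow" is assumed to be an already-trained Glow model from this repo.
# The latent shape and the z= keyword below are illustrative assumptions.
z = 0.6 * torch.randn(1, 48, 4, 4)    # z ~ N(0, 0.6^2)
with torch.no_grad():
    x = glow(z=z, reverse=True)       # reverse pass: latent -> image
print(x.max(), torch.isinf(x).any())  # many values > 1, sometimes Inf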

Thanks.

isharifi · Sep 19 '20

Sorry, I'm not sure what caused the problem. What do the generated images with abnormal values look like?

chaiyujin · Sep 26 '20

Actually, I found the part where the value explosion occurs. It happens in module.py at line 53, where the input is scaled by torch.exp(logs). The Inf value often appears around layer 80 during the forward pass (reverse=True). The generated image with negative Inf values then looks something like this (clamped to [0,1]): [attached image]

As a result, the gradient in the backward pass becomes Inf too, so training becomes impossible.
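
One common mitigation, sketched here under the assumption that logs is the log-scale tensor exponentiated at that line (this is not the repo's code), is to bound logs before calling exp:

import torch

def safe_exp_scale(logs, bound=9.0):
    # Clamp the log-scale so torch.exp() cannot overflow to Inf.
    # "bound" is a hypothetical hyperparameter; exp(9) is about 8100.
    return torch.exp(torch.clamp(logs, min=-bound, max=bound))

print(safe_exp_scale(torch.tensor([100.0])))  # finite (~8103) instead of Inf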

isharifi · Sep 26 '20

I am getting similar high values. @isharifi have you been able to resolve the issue?

karasepid · Nov 13 '20

Unfortunately, the problem has not been resolved. @chaiyujin, do you have any idea?

isharifi · Nov 13 '20

@isharifi Sorry for the delay; I've been busy with my own project. I think it may be caused by some invalid numerical operation.

chaiyujin · Nov 24 '20

A numerical division-by-zero issue can arise in https://github.com/chaiyujin/glow-pytorch/blob/487a6b149295f4ec4b36e408f63604c593ff2031/glow/models.py#L93 when there are zero elements in the sigmoid output.
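
The failure mode is easy to reproduce in isolation (plain PyTorch, not the repo's code):

import torch

scale = torch.sigmoid(torch.tensor([-500.0]))  # underflows to exactly 0.0 in float32
print(torch.tensor([1.0]) / scale)             # tensor([inf])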

For me, the following code snippet triggers the division by zero (running unconditional generation):

import torch

# "glow" is assumed to be an already-constructed Glow model from this repo.
torch.cuda.manual_seed_all(16)
glow = glow.to('cuda')
glow(reverse=True)

Or on CPU:

import torch

torch.manual_seed(37)
glow = glow.to('cpu')
glow(reverse=True)

I couldn't reproduce it with conditional generation, though. A possible fix would be to add a small value elementwise to the scale before the division, as sketched below.
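
A sketch of that fix (function and variable names are my assumptions, not the repo's code; the +2.0 shift follows the usual Glow affine-coupling parameterization):

import torch

def stable_reverse_scale(z, logs, eps=1e-6):
    # Add a small eps to the sigmoid output so the division in the
    # reverse pass can never hit an exact zero.
    scale = torch.sigmoid(logs + 2.0) + eps
    return z / scale

# Even when the sigmoid underflows to exactly 0, the result stays finite:
z = torch.randn(4)
logs = torch.full_like(z, -500.0)
print(torch.isinf(stable_reverse_scale(z, logs)).any())  # tensor(False)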

tenpercent · May 20 '21

@tenpercent Thanks. I will check whether it solves the problem and let you know the result.

isharifi · May 25 '21