
nan values for the flow module, how to avoid, or mask out properly?

Open vamp-ire-tap opened this issue 3 years ago • 3 comments

Hello,

thank you for making the code available, i have a problem with the following lines of code (https://github.com/zalandoresearch/pytorch-ts/blob/master/pts/modules/flows.py#L339):

    def log_prob(self, x, cond):
        u, sum_log_abs_det_jacobians = self.forward(x, cond)
        return torch.sum(self.base_dist.log_prob(u) + sum_log_abs_det_jacobians, dim=-1)

basically, the tensor u occasionally contains NaN values, which makes the base_dist.log_prob call fail. how can i resolve this issue according to best practices?
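One way to keep the loss finite when a handful of entries in u are NaN is to mask those positions out of the log-likelihood sum. The sketch below is an assumption-laden illustration (it uses a standard Normal base distribution, not necessarily the one in pytorch-ts), and masking only hides the symptom; the source of the NaNs should still be tracked down:

```python
import torch
from torch.distributions import Normal

# Hypothetical sketch: standing in for the flow's base distribution.
base_dist = Normal(torch.zeros(3), torch.ones(3))

u = torch.randn(4, 3)
u[0, 1] = float("nan")  # simulate an occasional NaN in u

nan_mask = torch.isnan(u)                           # True where u is NaN
u_safe = torch.where(nan_mask, torch.zeros_like(u), u)
log_p = base_dist.log_prob(u_safe)
log_p = log_p.masked_fill(nan_mask, 0.0)            # exclude NaN positions
loss = -log_p.sum(dim=-1)                           # finite per-sample loss
assert torch.isfinite(loss).all()
```

The same mask would also need to be applied to sum_log_abs_det_jacobians for the masked likelihood to stay consistent.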

thank you

vamp-ire-tap avatar Jan 10 '22 11:01 vamp-ire-tap

right, so since the networks inside the coupling layers are simple linear layers, I suspect you get NaNs when there are NaNs in your input data... the x or potentially the cond... can you check whether that is the case?

kashif avatar Jan 10 '22 11:01 kashif
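A quick torch-native way to run the check suggested above, without round-tripping through NumPy, is a small helper like this (the helper name is hypothetical, not part of pytorch-ts):

```python
import torch

def first_nan_report(name, t):
    """Say whether a tensor contains NaNs and where the first one sits."""
    mask = torch.isnan(t)
    if mask.any():
        idx = mask.nonzero()[0].tolist()  # index of the first NaN entry
        return f"{name}: NaN at index {idx} ({int(mask.sum())} total)"
    return f"{name}: no NaNs"

x = torch.randn(2, 5)
print(first_nan_report("x", x))      # x: no NaNs
x[1, 3] = float("nan")
print(first_nan_report("x", x))      # x: NaN at index [1, 3] (1 total)
```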

thanks a lot for the swift response, the x and cond variables do not seem to have NaNs:

(Pdb) !np.argwhere(np.isnan(x.cpu().detach().numpy()))
array([], shape=(0, 3), dtype=int64)
(Pdb) !np.argwhere(np.isnan(cond.cpu().detach().numpy()))
array([], shape=(0, 3), dtype=int64)
(Pdb) 

however, the u tensor has several:

(Pdb) !np.argwhere(np.isnan(u.cpu().detach().numpy())).shape
(3700, 3)

vamp-ire-tap avatar Jan 10 '22 11:01 vamp-ire-tap
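If the inputs are clean but u comes out with NaNs, the NaNs must appear somewhere inside the flow's forward pass. One debugging sketch (a generic PyTorch technique, not specific to pytorch-ts; the toy model below is hypothetical) is to attach forward hooks that raise as soon as any submodule emits a NaN, pinpointing the offending layer:

```python
import torch
import torch.nn as nn

def add_nan_hooks(model):
    """Attach forward hooks that raise at the first module emitting NaNs."""
    def hook(module, inputs, output):
        if isinstance(output, torch.Tensor) and torch.isnan(output).any():
            raise RuntimeError(f"NaN produced by {module.__class__.__name__}")
    return [m.register_forward_hook(hook) for m in model.modules()]

# Toy network standing in for a coupling-layer net (hypothetical).
model = nn.Sequential(nn.Linear(3, 8), nn.ReLU(), nn.Linear(8, 3))
handles = add_nan_hooks(model)
out = model(torch.randn(4, 3))  # clean input passes without error
for h in handles:
    h.remove()                  # detach the hooks when done
```

torch.autograd.set_detect_anomaly(True) is a complementary tool when the NaNs first appear in gradients rather than activations; note both add overhead and are meant for debugging only.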

Hello,

Having the same type of problem (NaNs for u), I was wondering if you had found a solution.

thank you !

pdenailly avatar Jul 20 '22 16:07 pdenailly