Normal Inverse Gaussian and Softplus NaN Gradient
I've been encountering an error while training my models: NaNs are being introduced through the gradients during training. I think I've narrowed the cause down to the combination of the Normal Inverse Gaussian distribution and a subsequent Softplus bijector. I've tried reproducing it with the Normal distribution as well, but have been unable to. I haven't seen the issue with a bare distribution since #1778 was fixed, so I suspect this is something else.
A gist of the issue is here. The model creation and fit function are in a loop because, despite setting the seeds for TF and NumPy at the top, there is still some other source of randomness that causes it to fail only sometimes.
Any help would be appreciated.