
When I run "train_mosi.py", I get the results below, along with this warning: "UserWarning: dropout option adds dropout after all but last recurrent layer, so non-zero dropout expects num_layers greater than 1, but got dropout=0.3 and num_layers=1".

Open xiajili opened this issue 2 years ago · 1 comment

```
Model initialized
Epoch 0 complete! Average Training loss: nan
Training got into NaN values...
```

(the same output repeats on every run)
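The UserWarning comes from passing `dropout=0.3` to a single-layer recurrent module: PyTorch only applies that dropout *between* stacked layers, so with `num_layers=1` it is silently ignored. A minimal sketch of the usual workaround (the `input_size`/`hidden_size` values here are placeholders, not the repo's actual config):

```python
import torch.nn as nn

num_layers = 1
dropout = 0.3

# nn.LSTM applies dropout only between stacked layers, so with a
# single layer the option has no effect and PyTorch warns about it.
# Zeroing it out when num_layers == 1 silences the warning without
# changing behavior.
rnn = nn.LSTM(
    input_size=74,    # placeholder dimensions for illustration
    hidden_size=32,
    num_layers=num_layers,
    dropout=dropout if num_layers > 1 else 0.0,
    batch_first=True,
)
```

Note that this warning is orthogonal to the NaN loss; the dropout option being ignored cannot by itself produce NaN values.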

xiajili avatar Jul 08 '22 19:07 xiajili

I'm also encountering a similar issue: the output of BatchNorm1d becomes NaN, and I've confirmed that the batch variance is 0 at that point. Although the epsilon in BatchNorm1d is meant to prevent division by zero, I still end up with NaN values throughout the BatchNorm1d output. I'm curious how to resolve this. I'd appreciate it if you could comment here once you resolve the issue.
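One thing worth checking (an assumption about the cause, not a confirmed diagnosis for this repo): zero variance alone does not make BatchNorm1d emit NaN, because `eps` keeps the denominator positive. NaN more often enters upstream (e.g. a diverging loss or exploding gradients) and then propagates through the layer, which makes batchnorm look like the culprit. A quick sketch to isolate where the NaN first appears:

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm1d(3)

# A constant batch has zero variance, yet normalizes cleanly:
# (x - mean) / sqrt(var + eps) = 0 / sqrt(eps), not a division by zero.
x_const = torch.zeros(8, 3)
y = bn(x_const)
print(torch.isnan(y).any())  # tensor(False)

# If the *input* already contains NaN, batchnorm propagates it.
# Checking tensors right before the layer pinpoints the real source.
x_bad = torch.full((8, 3), float("nan"))
print(torch.isnan(x_bad).any())  # tensor(True)
```

If the input checks out clean, `torch.autograd.set_detect_anomaly(True)` can help trace which backward operation first produces the NaN.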

SeungYeonJeong22 avatar Jun 21 '23 09:06 SeungYeonJeong22