Low-rank-Multimodal-Fusion
When I run "train_mosi.py", I get the results below, along with this warning: "UserWarning: dropout option adds dropout after all but last recurrent layer, so non-zero dropout expects num_layers greater than 1, but got dropout=0.3 and num_layers=1".
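For the warning itself: PyTorch's `nn.LSTM` only applies dropout between stacked recurrent layers, so with `num_layers=1` the `dropout` argument is ignored and PyTorch warns about it; it is almost certainly unrelated to the NaN loss. A minimal sketch of the usual workaround, using illustrative sizes rather than the repo's actual hyperparameters:

```python
import torch.nn as nn

# Illustrative hyperparameters, not the repo's actual config.
num_layers = 1
dropout = 0.3

# nn.LSTM only applies dropout between stacked layers, so pass 0.0
# when there is a single layer; this silences the UserWarning without
# changing behavior (dropout was never applied with num_layers=1 anyway).
audio_rnn = nn.LSTM(
    input_size=74,
    hidden_size=32,
    num_layers=num_layers,
    dropout=dropout if num_layers > 1 else 0.0,
    batch_first=True,
)
```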
Epoch 0 complete! Average Training loss: nan
Training got into NaN values...
Model initialized
Epoch 0 complete! Average Training loss: nan
Training got into NaN values...
Model initialized
Epoch 0 complete! Average Training loss: nan
Training got into NaN values...
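To narrow down where the NaNs first appear, it can help to check the raw inputs and the loss before each backward pass. A minimal sketch of such a debugging loop, assuming a standard PyTorch training setup (`model`, `criterion`, `optimizer`, and `train_loader` are placeholders, not the repo's actual variables):

```python
import torch

def debug_first_nan(model, criterion, optimizer, train_loader):
    """Run one training pass and stop at the first non-finite batch or loss."""
    # Makes backward() raise at the exact op that produced a NaN gradient.
    torch.autograd.set_detect_anomaly(True)
    for batch_idx, (x, y) in enumerate(train_loader):
        # NaN/inf in the raw features poisons every layer downstream,
        # so check the inputs before blaming the model.
        if not torch.isfinite(x).all():
            print(f"batch {batch_idx}: non-finite values in the input features")
            return
        loss = criterion(model(x), y)
        if not torch.isfinite(loss):
            print(f"batch {batch_idx}: loss is {loss.item()}")
            return
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```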
I'm also encountering a similar issue, where the variance becomes 0 when passing through BatchNorm1d. I've confirmed that the NaNs are caused by this zero variance. Although I know the epsilon in BatchNorm1d is meant to prevent division by zero, I still end up with NaN values throughout the BatchNorm1d output. I'm curious how to resolve this; I'd appreciate it if you could comment here once you've resolved it.
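Note that a zero variance by itself should not produce NaN, since BatchNorm divides by sqrt(var + eps), which stays finite; NaN in the output usually means NaN or inf was already in the input, which also makes the computed batch statistics NaN. Some releases of the CMU-MOSI acoustic features reportedly contain such values, so it may be worth sanitizing the arrays before training. A minimal sketch under that assumption (the function, the clipping bound, and the variable names are illustrative, not the repo's code):

```python
import numpy as np

def sanitize(features, clip=1e6):
    """Replace NaN/inf in a feature array before it reaches the model.
    `clip` is an illustrative bound, not a value from this repo."""
    features = np.nan_to_num(features)    # NaN -> 0, +/-inf -> large finite numbers
    return np.clip(features, -clip, clip) # keep magnitudes sane for BatchNorm

# Hypothetical usage on a loaded split (variable name is a placeholder):
# train_audio = sanitize(train_audio)
```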