Zongmeng Zhang
> Thanks for your quick reply! It seems like they share the same scores.
>
> ```
> 194495 80.10902
> 194495 80.10902
> 194497 79.05869
> 194497 79.05869
> 18163020 76.142555
> 18163020 76.142555
> 18163021 75.90108
> 18163021...
> ```
I got the same warning too, and the loss becomes NaN during training. I still haven't figured it out.
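Not from the thread itself, but a minimal debugging sketch for the NaN-loss part (toy model, data, and optimizer are purely illustrative): enable autograd anomaly detection so the backward pass reports which op produced the NaN/Inf, and stop at the first non-finite loss so the offending step can be inspected.

```python
import torch
import torch.nn as nn

# Make backward() raise an error pointing at the op that produced a NaN/Inf.
torch.autograd.set_detect_anomaly(True)

model = nn.Linear(8, 1)           # placeholder model, not the one from the issue
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for step in range(100):
    inputs = torch.randn(16, 8)   # placeholder data
    targets = torch.randn(16, 1)

    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)

    # Stop at the first non-finite loss so the offending batch can be inspected.
    if not torch.isfinite(loss):
        print(f"non-finite loss at step {step}: {loss.item()}")
        break

    loss.backward()
    optimizer.step()
```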
> I got the same warning too, and the loss becomes NaN during training. I still haven't figured it out.

I set the input to be contiguous and then the warning...
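For reference, a minimal sketch of the "set the input to be contiguous" fix, assuming the warning is about a non-contiguous input tensor (e.g. one produced by a transpose or permute); the model and shapes below are placeholders, not from the original code:

```python
import torch
import torch.nn as nn

model = nn.Linear(8, 4)            # placeholder model

x = torch.randn(8, 16).t()         # .t() returns a non-contiguous view
print(x.is_contiguous())           # False

# Copy the data into a contiguous memory layout before the forward pass;
# this is the kind of change "set the input to be contiguous" refers to.
x = x.contiguous()
print(x.is_contiguous())           # True

out = model(x)
```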