grad_norm NaN question
{'loss': 0.2051, 'grad_norm': nan, 'learning_rate': 9.774074342405685e-06, 'epoch': 0.53}
{'loss': 0.0, 'grad_norm': nan, 'learning_rate': 9.773696745450087e-06, 'epoch': 0.53}
{'loss': 0.0, 'grad_norm': nan, 'learning_rate': 9.773318840518906e-06, 'epoch': 0.53}

Around epoch 0.53 the grad_norm becomes NaN and the loss collapses to 0.0. How can I handle the gradients (e.g. clip or skip them) so that training recovers instead of producing NaN loss and zero loss from this point on?
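One common mitigation, in case it helps: clip the gradient norm and skip the optimizer step whenever the norm comes back non-finite, so a single bad batch cannot poison the weights (once the weights contain NaN, every subsequent loss is NaN/0). This is only a sketch with a plain PyTorch loop, not the exact Trainer setup from the logs above; `safe_step` and `max_norm=1.0` are my own names/choices (in the HF Trainer the closest knob is the `max_grad_norm` argument, and lowering the learning rate also often helps).

```python
import torch

def safe_step(model, optimizer, max_norm=1.0):
    """Clip gradients; apply the step only if the total norm is finite.

    Returns True if the step was applied, False if it was skipped
    because the gradient norm was NaN/inf.
    """
    total_norm = torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm)
    if not torch.isfinite(total_norm):
        optimizer.zero_grad()  # drop the poisoned gradients entirely
        return False
    optimizer.step()
    optimizer.zero_grad()
    return True

# Tiny demonstration: one healthy step, then one with an injected NaN.
torch.manual_seed(0)
model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
x = torch.randn(8, 4)

loss = model(x).pow(2).mean()
loss.backward()
stepped_ok = safe_step(model, optimizer)       # finite grads -> step applied

loss = model(x).pow(2).mean()
loss.backward()
model.weight.grad[0, 0] = float("nan")         # simulate a NaN gradient
stepped_bad = safe_step(model, optimizer)      # NaN norm -> step skipped

print(stepped_ok, stepped_bad)
print(torch.isfinite(model.weight).all().item())  # weights stay finite
```

After a skipped step the model parameters are untouched, so the next batch trains from a clean state. If the NaN reappears on many batches, that usually points at the data (bad samples, labels out of range) or a too-high learning rate rather than one unlucky batch.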
Hello, I'm also experiencing the same issue. Could you please tell me how you resolved it?