seq2seq
InvalidArgumentError: Found Inf or NaN gradient (global norm).
No, I am seeing the same error. I also used the same function (tf.clip_by_global_norm), but I found that the learning rate and that function are not the root cause. When I generated the vocab I set its size to 4682, and vocab_size is also 4682 in train.py. Likewise, I do not know whether decreasing the batch size would help. I saw an answer suggesting it might be related to vanishing/exploding gradients, but I have no other ideas.
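For what it's worth, clipping cannot repair a gradient that is already Inf/NaN; tf.clip_by_global_norm only rescales finite gradients, and the error above means the global norm itself was non-finite before clipping. Here is a minimal NumPy sketch of that behavior (the function name and error message are chosen to mirror TensorFlow's, but this is an illustration, not TF's actual implementation):

```python
import numpy as np

def clip_by_global_norm(grads, clip_norm):
    """Rescale all gradients so their combined L2 norm is at most clip_norm.

    Mirrors the idea behind tf.clip_by_global_norm: if any gradient already
    contains Inf or NaN, the global norm is non-finite and clipping cannot
    fix it, so we raise, as TF does with its "Found Inf or NaN" error.
    """
    global_norm = np.sqrt(sum(np.sum(np.square(g)) for g in grads))
    if not np.isfinite(global_norm):
        raise ValueError("Found Inf or NaN gradient (global norm).")
    # Scale factor is 1.0 when the norm is already within the limit.
    scale = clip_norm / max(global_norm, clip_norm)
    return [g * scale for g in grads], global_norm
```

So if this error fires, the NaN/Inf is produced upstream (e.g. by a too-large learning rate, a division/log of zero in the loss, or bad input data), and lowering the learning rate or checking the data is more likely to help than changing the clipping itself.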
In train.py:
The error is:
and the error is raised here:
I found a questionable point: GPU:0. I think it may be related to my GPU, so I tried adding code like this:
I do not know whether this will help, but I want to try it.
Have you solved this error?