[Potential NAN bug] Loss may become NAN during training
Hello~
Thank you very much for sharing the code!
I tried to use my own dataset (with the same shape as MNIST) with this code. After some iterations, the training loss became NaN. After carefully checking the code, I found that the following line may produce NaN in the loss:
In /python/algorithms/Classification Algorithms/MNIST Digit cassification using CNN.py, line 245:
cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y), reduction_indices=[1]))
If y (the output of the softmax) contains a 0, tf.log(y) evaluates to -inf because log(0) is undefined. When that -inf is multiplied by a 0 entry of y_, the product is NaN, and this makes the whole loss NaN.
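A quick illustration of the failure mode in plain NumPy (the values below are made up just for the example):

```python
import numpy as np

# One-hot label and a softmax output that has saturated to exactly 0/1.
y_label = np.array([[1.0, 0.0]])
y_softmax = np.array([[1.0, 0.0]])

# log(0) -> -inf, and 0 * -inf -> nan, so the reduced loss is nan.
loss = np.mean(-np.sum(y_label * np.log(y_softmax), axis=1))
print(loss)  # nan (NumPy also emits divide-by-zero / invalid-value warnings)
```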
It could be fixed by making either of the following changes:
cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y + 1e-8), reduction_indices=[1]))
or
cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(tf.clip_by_value(y,1e-8,1.0)), reduction_indices=[1]))
The same problem also exists at line 645:
cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y_CNN), reduction_indices=[1]))
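Alternatively, if the pre-softmax activations are available, TensorFlow's built-in fused op handles this case in a numerically stable way. A minimal sketch (TF 1.x), assuming a tensor named logits holds the output of the last layer before the softmax (the actual variable name in the script may differ):

```python
import tensorflow as tf

# Hypothetical names: `logits` is the pre-softmax output of the last layer,
# `y_` the one-hot labels (both already defined in the script).
# The fused op computes the softmax and the log together in a numerically
# stable way, so saturated predictions no longer hit log(0).
cross_entropy = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=logits))
```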
Hope to hear from you ~
Thanks in advance! : )