Deep-Learning-TensorFlow
Possible NaN bug in logistic regression
Thanks for offering this great repository!
I found that https://github.com/gabrieleangeletti/Deep-Learning-TensorFlow/blob/ddeb1f2848da7b7bee166ad2152b4afc46bb2086/yadlt/models/linear/logistic_regression.py#L52-L55 may contain a numerical bug when `loss_func` is `"cross_entropy"`: the loss takes the log of `self.mod_y`, and the softmax output can be arbitrarily close to zero, which produces `-inf`/NaN values during training.
To fix this, maybe we can replace

```python
self.mod_y = tf.nn.softmax(
    tf.add(tf.matmul(self.input_data, self.W_), self.b_))
```

with

```python
self.mod_y = tf.clip_by_value(
    tf.nn.softmax(
        tf.add(tf.matmul(self.input_data, self.W_), self.b_)),
    1e-10, 1.0)
```

so that `self.mod_y` is bounded away from zero before the log in the cross-entropy loss. (Clipping the logits before the softmax would not help, since the log is applied to the softmax output. Alternatively, `tf.nn.softmax_cross_entropy_with_logits` computes the same loss in a numerically stable way directly from the logits.)
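To illustrate the failure mode, here is a minimal NumPy sketch (not the repository's actual loss code; the `cross_entropy` helper and its `eps` parameter are hypothetical stand-ins for the loss at L52-L55 and the proposed clipping threshold):

```python
import numpy as np

# Hypothetical helper mimicking a cross-entropy loss computed from
# softmax outputs; `eps` stands in for the proposed clipping threshold.
def cross_entropy(y_true, y_pred, eps=None):
    if eps is not None:
        # the proposed fix: keep probabilities bounded away from zero
        y_pred = np.clip(y_pred, eps, 1.0)
    return -np.sum(y_true * np.log(y_pred))

# A saturated softmax output: one class gets all the probability mass,
# so another entry of y_pred is exactly zero.
y_true = np.array([1.0, 0.0])
y_pred = np.array([1.0, 0.0])

print(cross_entropy(y_true, y_pred))             # 0 * log(0) -> nan
print(cross_entropy(y_true, y_pred, eps=1e-10))  # finite with the clip
```

The unclipped call hits `0 * log(0)`, which NumPy (like TensorFlow) evaluates to NaN, and a single NaN poisons the gradient for the whole batch; the clipped call stays finite.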
Thanks!