cnn-text-classification-tf
Output from the scores not right and also how to apply embeddings during testing
Hello Denny,
Thanks for the great code. But, I have two issues/questions for the same. Firstly, in the text_cnn.py, you are taking the predictions from the matrix multiplication layer as:
self.scores = tf.nn.xw_plus_b(self.h_drop, W, b, name="scores")
self.predictions = tf.argmax(self.scores, 1, name="predictions")
However, the predictions should arguably be taken from a softmax layer. It won't change the predicted class, since the class with the highest score also gets the highest probability, but using this code for other projects made me realize that this part of the implementation could be improved slightly.
Secondly, for words that are not in the training vocabulary, is it possible to feed their vectors at prediction time only? N.b. I am not updating the vectors during training.
Thanks in advance, Regards, Debanjan
I would use softmax if I want to get also the probabilities such as:
self.scores = tf.nn.xw_plus_b(self.h_drop, W, b, name="scores")
self.probabilities = tf.nn.softmax(self.scores)
self.predictions = tf.argmax(self.scores, 1, name="predictions")
But I don't need softmax if I just want the prediction: tf.argmax(self.scores, 1, name="predictions") returns the same result as tf.argmax(self.probabilities, 1, name="predictions") when self.probabilities = tf.nn.softmax(self.scores), because softmax only normalizes the scores without changing their order.
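This invariance is easy to verify with a small sketch. The example below uses plain NumPy (rather than TensorFlow, just for illustration) with made-up score values: softmax is a strictly increasing transform of each row, so the per-row argmax is unchanged.

```python
import numpy as np

def softmax(scores):
    # Subtract the row max for numerical stability; this shift
    # cancels out in the normalization and does not change the result.
    shifted = scores - scores.max(axis=1, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=1, keepdims=True)

# Hypothetical raw scores (logits) for 2 examples and 3 classes.
scores = np.array([[2.0, 1.0, 0.5],
                   [0.1, 3.0, 1.2]])

probs = softmax(scores)

# softmax preserves the ordering within each row, so argmax is identical.
assert (scores.argmax(axis=1) == probs.argmax(axis=1)).all()
# Each row of probabilities sums to 1.
assert np.allclose(probs.sum(axis=1), 1.0)
```

So the softmax layer is only needed when the calibrated probabilities themselves are of interest, e.g. for thresholding or reporting confidence.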
Regards, Cahya