cnn-text-classification-tf

Output from the scores is not quite right, and how to apply embeddings during testing

Open · DeepInEvil opened this issue · 1 comment

Hello Denny,

Thanks for the great code. I have two questions about it. Firstly, in text_cnn.py, you take the predictions directly from the output of the final matrix multiplication:

self.scores = tf.nn.xw_plus_b(self.h_drop, W, b, name="scores")
self.predictions = tf.argmax(self.scores, 1, name="predictions")

But shouldn't the predictions be taken from a softmax layer? In practice it won't matter much, because the class with the highest score also gets the highest probability, but reusing this code in other projects made me realize that this part of the implementation could be tidied up a little.

Secondly, for words that are not in the training vocabulary, is it possible to feed in their pre-trained vectors at prediction time only? N.B. I am not updating the word vectors during training.
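For concreteness, something along these lines is what I have in mind. This is only a rough sketch: the names (inject_oov_vectors, embedding_var, vocab, pretrained) are hypothetical, and it assumes the vocabulary has already been extended so each new test-time word has its own index, and that the learned embedding matrix can be read out of and assigned back into the session before evaluation.

import tensorflow as tf

def inject_oov_vectors(sess, embedding_var, vocab, oov_words, pretrained):
    # Read the embedding matrix learned during training: [vocab_size, embedding_dim].
    embeddings = sess.run(embedding_var)
    for word in oov_words:
        idx = vocab.get(word)  # index the word was mapped to in the extended vocabulary
        if idx is not None and word in pretrained:
            # Overwrite only the OOV row with its pre-trained vector (e.g. word2vec/GloVe),
            # leaving the rows learned during training untouched.
            embeddings[idx] = pretrained[word]
    # Push the patched matrix back into the graph before running predictions.
    sess.run(tf.assign(embedding_var, embeddings))

The idea is that the model graph itself stays unchanged; only the embedding rows for unseen words are filled in from pre-trained vectors before running the predictions.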

Thanks in advance,
Regards,
Debanjan

DeepInEvil · May 02 '17

I would use softmax if I also want to get the probabilities, for example:

self.scores = tf.nn.xw_plus_b(self.h_drop, W, b, name="scores")
self.probabilities = tf.nn.softmax(self.scores)
self.predictions = tf.argmax(self.scores, 1, name="predictions")

But I don't need softmax if I just want the predicted class: tf.argmax(self.scores, 1, name="predictions") gives the same result as tf.argmax(self.probabilities, 1, name="predictions") when self.probabilities = tf.nn.softmax(self.scores), because softmax only normalizes the scores without changing their order.
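A small, self-contained sketch to illustrate this (assuming a TF1-style graph and session, as this repo uses; the score values are made up):

import tensorflow as tf

# Made-up scores (logits) for a batch of 3 examples and 4 classes.
scores = tf.constant([[2.0, 0.5, -1.0, 0.1],
                      [0.3, 0.2, 4.0, -2.0],
                      [-0.5, 1.5, 1.4, 0.0]])
probabilities = tf.nn.softmax(scores)

pred_from_scores = tf.argmax(scores, 1)
pred_from_probs = tf.argmax(probabilities, 1)

with tf.Session() as sess:
    print(sess.run(pred_from_scores))  # [0 2 1]
    print(sess.run(pred_from_probs))   # [0 2 1], identical: softmax preserves the ordering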

Regards,
Cahya

cahya-wirawan · May 26 '17