
Issue about cross-entropy loss in TensorFlow

Open · deepmo24 opened this issue on Apr 24, 2019 · 0 comments

Hi, the cross-entropy loss you use in the TensorFlow implementation is 'sparse_softmax_cross_entropy_with_logits()'. However, this loss function applies a softmax to 'logits' internally. Since your L-GM loss does not contain a softmax operation, I wonder whether the loss function is used correctly here?
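For reference, here is a minimal sketch of the usage I am asking about. The variable names and the margin formulation below are my own illustration (roughly following the paper's margin-adjusted negative distances), not your actual code:

```python
import tensorflow as tf  # TF 2.x, eager mode

# Hypothetical illustration, not the repository's code.
# In the L-GM formulation the classification term is a softmax over
# margin-adjusted negative squared distances to the class Gaussian means,
# so the "logits" below are not pre-softmax scores from a dense layer.
batch_size, num_classes = 4, 10
margin = 0.1

neg_sq_dist = -tf.random.uniform([batch_size, num_classes])  # stand-in for -d_k
labels = tf.constant([1, 3, 0, 7], dtype=tf.int64)
one_hot = tf.one_hot(labels, num_classes)

# Enlarge the distance of the ground-truth class by (1 + margin).
logits = neg_sq_dist * (1.0 + margin * one_hot)

# This op applies softmax to `logits` internally and then computes the
# cross entropy -- this internal softmax is what my question is about.
loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels,
                                                      logits=logits)
print(loss)
```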

I am looking forward to hearing from you.

deepmo24 · Apr 24, 2019, 09:04