
F.log_softmax(x, dim=1) output is not probability?

Open xypan1232 opened this issue 6 years ago • 6 comments

Hi,

Calling output = model(features, adj) does not give probability outputs. If I want the model to return probabilities, what should I change? And if I change log_softmax to softmax, does the loss function F.nll_loss also need to be changed? Thanks.

xypan1232 avatar Aug 31 '18 11:08 xypan1232

Yes, log_softmax gives you log probabilities. As far as I remember, F.nll_loss takes log probabilities as input. So if you want to use softmax outputs directly, you'd have to adapt the loss function.
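A minimal sketch of this relationship, using hypothetical logits in place of the model's pre-softmax scores: exponentiating the log_softmax output recovers probabilities, and F.nll_loss on log probabilities matches taking the log of softmax probabilities yourself.

```python
import torch
import torch.nn.functional as F

# Hypothetical raw scores for 3 nodes and 4 classes
# (stand-in for what model(features, adj) computes before log_softmax)
logits = torch.randn(3, 4)
labels = torch.tensor([0, 2, 1])

log_probs = F.log_softmax(logits, dim=1)   # log probabilities
probs = torch.exp(log_probs)               # actual probabilities; each row sums to 1

# F.nll_loss expects log probabilities:
loss = F.nll_loss(log_probs, labels)

# If the model returned softmax probabilities instead, you would have to
# take the log yourself before calling F.nll_loss:
loss_from_probs = F.nll_loss(torch.log(F.softmax(logits, dim=1)), labels)

print(torch.allclose(loss, loss_from_probs))
```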


tkipf avatar Aug 31 '18 12:08 tkipf

Or use cross entropy loss...?

roireshef avatar Feb 12 '19 21:02 roireshef

Depends on whether your cross entropy loss function takes in logits or probabilities...


tkipf avatar Feb 13 '19 10:02 tkipf

@tkipf I meant this loss: https://pytorch.org/docs/0.3.1/nn.html#torch.nn.CrossEntropyLoss AFAIK it takes probabilities.

roireshef avatar Feb 13 '19 12:02 roireshef

PyTorch's nn.CrossEntropyLoss actually takes logits (internally it applies log_softmax followed by NLL loss).
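This equivalence is easy to check with a small sketch on hypothetical logits and targets:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical raw scores (logits, not probabilities) for 5 samples, 3 classes
logits = torch.randn(5, 3)
targets = torch.tensor([0, 1, 2, 1, 0])

# CrossEntropyLoss applied directly to logits...
ce = nn.CrossEntropyLoss()(logits, targets)

# ...equals log_softmax followed by NLL loss:
nll = F.nll_loss(F.log_softmax(logits, dim=1), targets)

print(torch.allclose(ce, nll))
```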


tkipf avatar Feb 13 '19 16:02 tkipf

log_softmax applies the log to the softmax output. In PyTorch, torch.log is the natural log by default. You can think of it like this: if y_hat is the result of the softmax, then the model's output is log(y_hat), the result of log_softmax. By the laws of logarithms, torch.exp inverts the log, so output = torch.exp(model(features, adj)) will give you the results as probabilities.
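A quick sketch confirming this, with x standing in for hypothetical pre-softmax scores:

```python
import torch
import torch.nn.functional as F

x = torch.randn(2, 7)  # hypothetical raw scores

# exp undoes the log, so exp(log_softmax(x)) equals softmax(x)
probs = torch.exp(F.log_softmax(x, dim=1))

print(torch.allclose(probs, F.softmax(x, dim=1)))
```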

NinaM31 avatar Apr 25 '21 19:04 NinaM31