pygcn
F.log_softmax(x, dim=1) output is not probability?
Hi,
calling output = model(features, adj) does not give probability outputs. If I want the model to return probabilities, what should I change? And if I change log_softmax to softmax, should the loss function F.nll_loss be changed as well? Thanks.
Yes, log_softmax gives you log probabilities. As far as I remember, F.nll_loss takes log probabilities as input. So if you want to use softmax outputs directly, you'd have to adapt the loss function.
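A minimal sketch of the two equivalent setups (the logits and target tensors here are made up for illustration; they are not from the pygcn code):

```python
import torch
import torch.nn.functional as F

logits = torch.randn(4, 3)            # stand-in for raw class scores
target = torch.tensor([0, 2, 1, 0])   # stand-in for class labels

# Option 1: log_softmax output + NLL loss (the pygcn setup)
log_probs = F.log_softmax(logits, dim=1)
loss1 = F.nll_loss(log_probs, target)

# Option 2: softmax gives probabilities; take the log before the NLL loss
probs = F.softmax(logits, dim=1)
loss2 = F.nll_loss(torch.log(probs), target)

print(torch.allclose(loss1, loss2))  # True (up to float error)
```

So switching the model to softmax means the loss side has to reintroduce the log somewhere.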
Or use cross entropy loss...?
Depends on whether your cross entropy loss function takes in logits or probabilities...
@tkipf I meant this loss: https://pytorch.org/docs/0.3.1/nn.html#torch.nn.CrossEntropyLoss AFAIK it takes probabilities.
PyTorch's nn.CrossEntropyLoss actually takes logits (it applies both log softmax and NLL loss internally).
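To check that equivalence concretely (again with made-up logits, not pygcn's actual outputs):

```python
import torch
import torch.nn.functional as F

logits = torch.randn(5, 4)
target = torch.randint(0, 4, (5,))

# cross_entropy on raw logits ...
ce = F.cross_entropy(logits, target)

# ... equals log_softmax followed by nll_loss
nll = F.nll_loss(F.log_softmax(logits, dim=1), target)

print(torch.allclose(ce, nll))  # True
```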
log_softmax applies log to the softmax output; in PyTorch, torch.log is the natural log by default. You can think of it like this: if y_hat is the result of softmax and output is the result of log_softmax, then by the laws of logarithms output = log(y_hat), so taking exp recovers y_hat.
output = torch.exp(model(features, adj))
Taking torch.exp will give you the results as probabilities.
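The exp trick in self-contained form (x stands in for the hidden activations the model feeds into log_softmax; the shapes are illustrative):

```python
import torch
import torch.nn.functional as F

x = torch.randn(4, 7)                # stand-in for the pre-softmax scores

log_probs = F.log_softmax(x, dim=1)  # what the pygcn model returns
probs = torch.exp(log_probs)         # invert the log to get probabilities

print(torch.allclose(probs, F.softmax(x, dim=1)))        # True
print(torch.allclose(probs.sum(dim=1), torch.ones(4)))   # rows sum to 1
```

This way the model and loss stay untouched; probabilities are recovered only where needed.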