Thomas Kipf
Try running it on CPU only, or put that tensor explicitly on the GPU. On Sun, Jun 2, 2019 at 3:43 AM verds wrote: > I have trained a GCN to...
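A minimal sketch of the device-placement fix in PyTorch; the model and tensors here are stand-ins for whatever the training script actually defines:

```python
import torch
import torch.nn as nn

# A device mismatch between the model and one of its input tensors is the
# usual cause of this kind of error; pick one device explicitly and move
# everything onto it.
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

model = nn.Linear(16, 7).to(device)         # stand-in for the trained GCN
features = torch.randn(10, 16).to(device)   # move every input tensor as well
output = model(features)
```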
Both versions result in similar performance, but slight differences in classification accuracy are possible. On Mon 2. Jul 2018 at 23:56 wangxiaoyunanne wrote: > Hi, > The normalize function in...
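For context, a sketch of the row-wise normalization (each row rescaled by its inverse row sum, i.e. D^-1 M) that the `normalize` function in this repository implements; reconstructed from memory, so check utils.py for the exact version:

```python
import numpy as np
import scipy.sparse as sp

def row_normalize(mx):
    """Row-normalize a sparse matrix: mx <- D^-1 mx (rows sum to 1)."""
    rowsum = np.array(mx.sum(1))
    r_inv = np.power(rowsum, -1).flatten()
    r_inv[np.isinf(r_inv)] = 0.   # all-zero rows stay zero
    return sp.diags(r_inv).dot(mx)

adj = sp.eye(3) + sp.csr_matrix([[0, 1, 0], [1, 0, 1], [0, 1, 0]])
print(row_normalize(adj).toarray())
```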
Hard to say; it’s often best to try both variants. On Sun 2. Dec 2018 at 22:30 Defa Zhu wrote: > Hi, > I want to know which version is...
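The two variants in question are row normalization, D^-1 (A+I), as sketched above, and the symmetric normalization D^-1/2 (A+I) D^-1/2 from the paper. A sketch of the symmetric variant, assuming the same scipy setup:

```python
import numpy as np
import scipy.sparse as sp

def sym_normalize(adj):
    """Symmetric normalization: D^-1/2 (A+I) D^-1/2, as in the GCN paper."""
    adj = adj + sp.eye(adj.shape[0])   # add self-loops
    d_inv_sqrt = np.power(np.array(adj.sum(1)).flatten(), -0.5)
    d_inv_sqrt[np.isinf(d_inv_sqrt)] = 0.
    d_mat = sp.diags(d_inv_sqrt)
    return d_mat.dot(adj).dot(d_mat)

adj = sp.csr_matrix([[0, 1, 0], [1, 0, 1], [0, 1, 0]])
adj_norm = sym_normalize(adj)
```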
Yes, you could normalize each feature to zero mean and unit standard deviation per column. On Tue 16. Apr 2019 at 06:00 AllenWu18 wrote: > Well, I think in the function...
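A minimal sketch of that per-column standardization; the function name and data are illustrative:

```python
import numpy as np

def standardize_columns(x, eps=1e-8):
    """Shift/scale each feature column to zero mean, unit standard deviation."""
    return (x - x.mean(axis=0)) / (x.std(axis=0) + eps)  # eps guards constant columns

features = np.random.rand(100, 16)   # dummy node-feature matrix
features = standardize_columns(features)
```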
Yes, `log_softmax` gives you log probabilities. As far as I remember, `F.nll_loss` takes log probabilities as input. So if you want to use softmax outputs directly, then you’d have to...
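To illustrate the pairing: `F.log_softmax` produces log probabilities, which is exactly what `F.nll_loss` expects (dummy logits and targets below):

```python
import torch
import torch.nn.functional as F

logits = torch.randn(4, 7)                 # raw model outputs, 7 classes
targets = torch.tensor([0, 3, 6, 2])

log_probs = F.log_softmax(logits, dim=1)   # log probabilities
loss = F.nll_loss(log_probs, targets)      # NLL expects log probabilities
```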
Depends on whether your cross entropy loss function takes in logits or probabilities... On Tue 12. Feb 2019 at 22:27 RoiGM wrote: > Or use cross entropy loss...? > >...
PyTorch's `nn.CrossEntropyLoss` actually takes logits (it applies log softmax and NLL loss in one step). On Wed, Feb 13, 2019 at 1:47 PM RoiGM wrote: > @tkipf I meant this loss: >...
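The equivalence is easy to check: raw logits fed to `nn.CrossEntropyLoss` give the same value as `log_softmax` followed by `nll_loss`:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

logits = torch.randn(4, 7)
targets = torch.tensor([0, 3, 6, 2])

ce = nn.CrossEntropyLoss()(logits, targets)
nll = F.nll_loss(F.log_softmax(logits, dim=1), targets)
assert torch.allclose(ce, nll)   # identical up to floating-point error
```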
The specific splits are chosen arbitrarily but are consistent in size with the ones we use in the paper. For the specific splits, have a look at the TensorFlow GCN implementation...
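A sketch of how such index-based splits can look in the PyTorch version; the exact ranges here are illustrative assumptions, and the canonical split files ship with the TensorFlow GCN implementation referenced above:

```python
import torch

# Illustrative fixed splits over the 2708 Cora nodes; check the TensorFlow
# GCN repository for the canonical splits.
idx_train = torch.arange(0, 140)       # labeled training nodes
idx_val = torch.arange(200, 500)       # validation nodes
idx_test = torch.arange(500, 1500)     # test nodes

# The loss is then computed only on the training indices, e.g.:
# loss = F.nll_loss(output[idx_train], labels[idx_train])
```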
Hi @evanfeinberg, thanks! It was a pleasure meeting you as well. I posted a solution for this problem some time ago in this thread (from the gcn repository): https://github.com/tkipf/gcn/issues/4 ....
I don’t have an implementation for this, unfortunately. On Sat 21. Apr 2018 at 12:06 idansc wrote: > @tkipf thanks for the package, it is really > useful. > Will...