gae-pytorch
pos_weight should be a Tensor?
When I train with the Cora dataset, I get the following error in binary_cross_entropy_with_logits. Shouldn't pos_weight be a Tensor? Thanks!
Traceback (most recent call last):
File "train.py", line 83, in <module>
gae_for(args)
File "train.py", line 62, in gae_for
norm=norm, pos_weight=pos_weight)
File "/gae-pytorch/gae/optimizer.py", line 7, in loss_function
cost = norm * F.binary_cross_entropy_with_logits(preds, labels, pos_weight=pos_weight)
File "/anaconda3/lib/python3.6/site-packages/torch/nn/functional.py", line 2077, in binary_cross_entropy_with_logits
return torch.binary_cross_entropy_with_logits(input, target, weight, pos_weight, reduction_enum)
TypeError: binary_cross_entropy_with_logits(): argument 'pos_weight' (position 4) must be Tensor, not numpy.float64
I fixed it by replacing this line:
https://github.com/zfjsail/gae-pytorch/blob/22c9edfd37d7679ebff32e4ad92eb89cd96837da/gae/train.py#L48
with this:
pos_weight = torch.Tensor([float(adj.shape[0] * adj.shape[0] - adj.sum()) / adj.sum()])
It works, but I noticed that the length of pos_weight is 1. According to the documentation:
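A minimal sketch of the fix above, using a small dense 0/1 adjacency matrix as a stand-in for Cora's adj (in the repo adj is a scipy sparse matrix, which is why adj.sum() originally produced a numpy.float64):

```python
import torch
import torch.nn.functional as F

# Toy adjacency matrix standing in for Cora's `adj` (assumption:
# a dense 0/1 tensor; the repo uses a scipy sparse matrix).
adj = torch.tensor([[0., 1., 0.],
                    [1., 0., 1.],
                    [0., 1., 0.]])

n = adj.shape[0]
pos_sum = float(adj.sum())  # number of positive (present) edges

# Ratio of negative to positive entries, wrapped in a tensor so that
# binary_cross_entropy_with_logits accepts it as pos_weight.
pos_weight = torch.tensor([(n * n - pos_sum) / pos_sum])

preds = torch.zeros_like(adj)  # dummy logits in place of the model output
loss = F.binary_cross_entropy_with_logits(preds, adj, pos_weight=pos_weight)
print(pos_weight.shape)  # torch.Size([1])
```

Passing the plain Python/numpy float instead of the tensor reproduces the TypeError from the traceback.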
pos_weight (Tensor, optional) – a weight of positive examples. Must be a vector with length equal to the number of classes.
So shouldn't pos_weight's length be 2?
The docs also say: "where c is the class number (c>1 for multi-label binary classification, c=1 for single-label binary classification)". Here each adjacency entry is a single binary label, so c = 1 and a length of 1 is correct.
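The c = 1 case can be checked directly: a length-1 pos_weight broadcasts over every element, scaling only the positive-label terms. A small sketch with dummy logits and labels (all names here are illustrative, not from the repo):

```python
import torch
import torch.nn.functional as F

logits = torch.zeros(4, 4)  # dummy predictions
labels = torch.eye(4)       # dummy binary targets
pw = torch.tensor([3.0])    # single-class positive weight

# A length-1 pos_weight broadcasts across all entries, matching
# c = 1 (single-label binary classification) from the docs.
weighted = F.binary_cross_entropy_with_logits(logits, labels, pos_weight=pw)

# Equivalent manual computation: positive terms scaled by 3, negatives by 1.
p = torch.sigmoid(logits)
manual = -(labels * 3.0 * torch.log(p)
           + (1 - labels) * torch.log(1 - p)).mean()
print(torch.allclose(weighted, manual))  # True
```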
Hi, I have the same problem. Has it been solved?