
Can you help me

Open zhanking321 opened this issue 3 years ago • 7 comments

Excuse me, I would like to ask why the positive samples do not get similar embeddings after training?

zhanking321 avatar May 21 '21 08:05 zhanking321

Me too! I used the unsupervised method and it did not work.

ninesky110 avatar Jun 22 '21 10:06 ninesky110

Yes, so I am confused.

zhanking321 avatar Jun 22 '21 11:06 zhanking321

Have you tried adjusting the parameters or replacing the unsupervised task with a supervised one?

ninesky110 avatar Jun 22 '21 11:06 ninesky110

I replaced the unsupervised task with a supervised task and found that it worked.

ninesky110 avatar Jun 23 '21 08:06 ninesky110

Negative sampling is not used in the supervised case. @ninesky110, will you elaborate on how it did not work? The embeddings learned during unsupervised training should differ as things are adjusted (random walks, number of negative samples, etc.), and to perform inference you will then need to use that embedding for something like linear regression or an SVM. @zhanking321, by positive sampling do you mean nodes that are sampled and considered to be of the positive class, or do you mean the number of negative samples? Also, how are you determining whether the embeddings are different between different training sessions?
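
For a quick sanity check, here is a minimal sketch of one way to measure that, assuming the learned embeddings are available as a NumPy array `emb` indexed by node ID and that `pos_pairs` holds the positive (co-occurring) pairs; both names are placeholders, not variables from this repo:

```python
import numpy as np

def mean_cosine_similarity(emb, pairs):
    """Average cosine similarity over the given (u, v) node-ID pairs."""
    u = emb[[p[0] for p in pairs]]
    v = emb[[p[1] for p in pairs]]
    u = u / np.linalg.norm(u, axis=1, keepdims=True)
    v = v / np.linalg.norm(v, axis=1, keepdims=True)
    return float(np.mean(np.sum(u * v, axis=1)))

# emb: (num_nodes, dim) array dumped after unsupervised training
# pos_pairs: edges or random-walk co-occurrences; rand_pairs: a random baseline
rng = np.random.default_rng(0)
num_nodes, dim = 1000, 128
emb = rng.normal(size=(num_nodes, dim))                     # stand-in for real embeddings
pos_pairs = [(i, (i + 1) % num_nodes) for i in range(500)]  # stand-in positive pairs
rand_pairs = [tuple(rng.integers(0, num_nodes, size=2)) for _ in range(500)]

print("positive pairs:", mean_cosine_similarity(emb, pos_pairs))
print("random pairs:  ", mean_cosine_similarity(emb, rand_pairs))
```

If training worked, the positive-pair similarity should be clearly higher than the random-pair baseline; from there the embeddings can be fed into a linear model or an SVM for downstream evaluation.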

sam-lev avatar Jun 23 '21 20:06 sam-lev

In supervised tasks there is actually no special sampling of positive and negative samples. Each batch simply samples some nodes from the graph at random, computes the sampled nodes' embeddings through aggregation and concatenation, and predicts their category. Finally, the predicted category and the real category are passed to the sigmoid_cross_entropy_with_logits function.
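
For what it's worth, here is a rough sketch of that supervised path in TensorFlow; the names `self_feats`, `neigh_feats`, `W`, and `b` are placeholders of mine, not the repo's actual variables, and only a single mean-aggregation layer is shown:

```python
import tensorflow as tf

def supervised_batch_loss(self_feats, neigh_feats, labels, W, b):
    """One GraphSAGE-style step on a sampled batch: aggregate the sampled
    neighbours, concatenate with each node's own features, predict logits,
    and apply sigmoid cross-entropy against the true labels."""
    # Mean-aggregate neighbours: (batch, num_samples, in_dim) -> (batch, in_dim)
    agg = tf.reduce_mean(neigh_feats, axis=1)
    # Concatenate self representation with the aggregated neighbourhood
    h = tf.concat([self_feats, agg], axis=1)           # (batch, 2 * in_dim)
    logits = tf.matmul(h, W) + b                       # (batch, num_classes)
    loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits)
    return tf.reduce_mean(loss), logits

# Toy shapes: batch of 4 nodes, 5 sampled neighbours each, 8-dim features, 3 classes
batch, samples, in_dim, n_classes = 4, 5, 8, 3
self_feats = tf.random.normal([batch, in_dim])
neigh_feats = tf.random.normal([batch, samples, in_dim])
labels = tf.constant([[1., 0., 0.], [0., 1., 0.], [0., 0., 1.], [1., 1., 0.]])
W = tf.Variable(tf.random.normal([2 * in_dim, n_classes]))
b = tf.Variable(tf.zeros([n_classes]))

loss, logits = supervised_batch_loss(self_feats, neigh_feats, labels, W, b)
print(loss)
```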

ninesky110 avatar Jun 24 '21 01:06 ninesky110

Right, I agree, so maybe I'm not understanding the question. My point was that if sampling is what you would like to know more about (e.g. the difference in embeddings), it wouldn't be informative to compare the supervised and unsupervised models. So for @zhanking321, I was wondering how he is comparing embeddings in the unsupervised model, and noting that switching to supervised wouldn't explain how the embeddings were learned with unsupervised sampling. And with respect to your question, @ninesky110, I was wondering what issues you were having with the unsupervised model.
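
For reference, the unsupervised objective from the GraphSAGE paper is what should pull a node and its random-walk co-occurrences together. Here is a minimal NumPy sketch of that per-node loss, with `z_u`, `z_pos`, and `z_neg` as placeholder embedding vectors and Q negative samples:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def unsupervised_loss(z_u, z_pos, z_neg):
    """GraphSAGE unsupervised loss for one node u:
    -log sigma(z_u . z_pos) - Q * mean(log sigma(-z_u . z_neg_i)),
    where z_pos co-occurs with u on a random walk and z_neg are
    Q negatively sampled nodes."""
    pos_term = -np.log(sigmoid(np.dot(z_u, z_pos)))
    neg_term = -len(z_neg) * np.mean(np.log(sigmoid(-(z_neg @ z_u))))
    return pos_term + neg_term

rng = np.random.default_rng(0)
dim, Q = 16, 10
z_u, z_pos = rng.normal(size=dim), rng.normal(size=dim)
z_neg = rng.normal(size=(Q, dim))
print(unsupervised_loss(z_u, z_pos, z_neg))
# Minimising this pushes z_u towards z_pos and away from the negatives,
# which is why positive pairs should end up with similar embeddings.
# If they don't, the walk settings, number of negative samples, and
# learning rate are the first things to check.
```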

sam-lev avatar Jun 28 '21 20:06 sam-lev