
Questions about the accuracy of replicating GKDE on Amazon Computers and Photo datasets

Shiy-Li opened this issue

Issue Description: I am writing a related paper, so I am trying to reproduce the results of this paper on several datasets. However, I ran into difficulties reproducing your experiments on the Amazon Photo and Computers datasets. I followed the instructions in your GitHub repository, but I could not obtain the results reported in the paper.

Steps Taken So Far & Problems:

  1. In the Kernel_Graph.py file, the code for the Misclassification task on the Amazon Photo and Computers datasets appears to be missing. I believe there should be a function called all_kde_npy that implements the first step for the Photo and Computers datasets. I added this function myself by following the differences between all_kde and all_kde_ood, and I was then able to run the first step (python Kernel_Graph.py) successfully.
  2. In the second step (python Baseline.py -dataset amazon_electronics_computers --model GCN --OOD_detection 0), execution reached roc, pr = Misclassification_npz(outs, FLAGS.dataset, FLAGS.model) and raised an error in entropy = - prob * (np.log(prob) / np.log(class_num)). After checking, I found that np.log(prob) produced '-inf' values, so I modified the function entropy_softmax(pred) to use prob = softmax(pred) + 1e-10 to avoid this, after which the second step ran normally.
  3. In the third step (python S-BGCN-T-K_npy.py -dataset amazon_electronics_computers --OOD_detection 0), the very first epoch produced train_loss = nan and val_loss = nan, which prevented me from obtaining a final result.
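For reference, the epsilon fix described in step 2 can be sketched as a standalone NumPy snippet. This is a minimal illustration, not the actual code from Baseline.py: the function names (softmax, entropy_softmax) and the assumption that entropy_softmax takes raw logits plus class_num are inferred from the error message above and may differ from the repository's implementation.

```python
import numpy as np

def softmax(pred):
    # subtract the row-wise max before exponentiating for numerical stability
    e = np.exp(pred - pred.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def entropy_softmax(pred, class_num, eps=1e-10):
    # add a small epsilon so np.log never sees an exact zero; without it,
    # a saturated softmax yields prob = 0, np.log(prob) = -inf, and the
    # resulting NaNs can propagate into downstream losses
    prob = softmax(pred) + eps
    # entropy normalized by log(class_num) so the maximum value is 1
    return -np.sum(prob * (np.log(prob) / np.log(class_num)), axis=-1)
```

With extreme logits (e.g. [1000, 0, 0]) the plain softmax saturates to [1, 0, 0], so the unpatched np.log(prob) produces -inf; the epsilon keeps the entropy finite. Note this patches only the metric computation, so it does not by itself explain the NaN training losses in step 3, which likely come from a similar log-of-zero (or division-by-zero) inside the loss function.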

I wonder if you could help me with these issues and provide some guidance on how to reproduce your experiments on the Amazon Photo and Computers datasets. I would really appreciate your assistance and feedback. Thank you very much for your time and attention.

Shiy-Li · Nov 17 '23 13:11