CompareNet_FakeNewsDetection
evaluation error
Thank you for sharing your source code. I ran into an issue when running it. Could you please tell me what the problem is?
Under normal execution, the code just before this error (line 93) should produce a 4×4 confusion matrix (since there are 4 categories). I am also unsure why this strange error occurs: it seems a 3×3 confusion matrix was obtained instead. Please check the setting of the number of categories, the shape of the obtained confusion matrix, and whether pandas automatically treats one row/column as an index.
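As a starting point, here is a minimal sketch of those checks, assuming the evaluation goes through pandas and scikit-learn; the file name `results.csv` and the column names `label` / `pred` are hypothetical placeholders for the actual outputs:

```python
import pandas as pd
from sklearn.metrics import confusion_matrix

# Hypothetical file/column names for illustration; adjust to the actual evaluation output.
# index_col=None prevents pandas from silently promoting the first column to the index.
df = pd.read_csv("results.csv", index_col=None)

print("categories in gold labels:", sorted(df["label"].unique()))
print("categories in predictions:", sorted(df["pred"].unique()))

# Passing the full label set forces a 4x4 matrix; otherwise only labels that
# actually occur in the data are included, which can yield a 3x3 matrix.
cm = confusion_matrix(df["label"], df["pred"], labels=[0, 1, 2, 3])
print("confusion matrix shape:", cm.shape)  # expect (4, 4)
print(cm)
```

If one of the printed category lists is missing a label, the 3×3 matrix likely comes from that missing category rather than from pandas.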
It seems this error was caused by a change in the data link that the code relies on. I have found the data on my local machine and uploaded it to the release (raw_data.zip). Please check whether the error still occurs: https://github.com/BUPT-GAMMA/CompareNet_FakeNewsDetection/releases/tag/dataset
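For reference, a small sketch of fetching and extracting the release archive; the asset URL is inferred from the release tag and file name above, and the target folder `data/` is an assumption:

```python
import urllib.request
import zipfile

# Download the raw_data.zip asset attached to the "dataset" release tag.
url = ("https://github.com/BUPT-GAMMA/CompareNet_FakeNewsDetection/"
       "releases/download/dataset/raw_data.zip")
urllib.request.urlretrieve(url, "raw_data.zip")

# Extract into a local data/ directory (adjust to wherever the code expects the data).
with zipfile.ZipFile("raw_data.zip") as zf:
    zf.extractall("data/")
```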
Thank you for your reply. The error has been solved, but the results are different from the ones in the paper. Is there anything wrong?
Hi, I also have the same issue regarding the results:
Could you explain a little? Thanks
The results I got from running it were also a few points lower than those reported in the paper. For SLN:
Accuracy on the out of domain test set SLN: 0.8667
Precision on the out of domain test set SLN macro / micro: 0.8668, 0.8667
Recall on the out of domain test set SLN macro / micro: 0.8667, 0.8667
F1 on the out of domain test set SLN macro / micro: 0.8667, 0.8667
Latex: 86.67 & 86.68 & 86.67 & 86.67
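For what it's worth, the macro/micro numbers above can be reproduced from the raw predictions roughly as in the following sketch, assuming scikit-learn is used; the label and prediction arrays here are dummy placeholders for the actual SLN outputs:

```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Dummy labels/predictions standing in for the real SLN gold labels and model outputs.
y_true = [0, 1, 2, 3, 0, 1]
y_pred = [0, 1, 2, 3, 1, 1]

print(f"accuracy: {accuracy_score(y_true, y_pred):.4f}")
for avg in ("macro", "micro"):
    p, r, f1, _ = precision_recall_fscore_support(y_true, y_pred, average=avg)
    print(f"{avg}: precision={p:.4f} recall={r:.4f} f1={f1:.4f}")
```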