CronKGQA
pretrained model
Hello, I'd like to ask why I get exactly the same results with distilbert-base-uncased and roberta-base.
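In case it helps narrow this down, here is a minimal sanity check (my own sketch, not code from this repo) that confirms the two Hugging Face checkpoints really are different models. If training still produces identical numbers, the chosen model name is probably not reaching the training code (for example, a hard-coded checkpoint name or an ignored command-line argument).

```python
# Sanity check: load both LMs with HuggingFace transformers and confirm
# they are genuinely different models (class names and parameter counts differ).
from transformers import AutoModel, AutoTokenizer

for name in ["distilbert-base-uncased", "roberta-base"]:
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModel.from_pretrained(name)
    n_params = sum(p.numel() for p in model.parameters())
    print(name, type(model).__name__, n_params, model.config.hidden_size)
```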
Do I need to delete temp.ckpt before training?
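For reference, this is what I currently do before a fresh run, just a sketch assuming temp.ckpt is written to the working directory (adjust the path if the repo saves it elsewhere):

```python
import os

# Remove the leftover checkpoint so the next run starts from scratch
# (assumes temp.ckpt sits in the current working directory).
ckpt = "temp.ckpt"
if os.path.exists(ckpt):
    os.remove(ckpt)
    print(f"Deleted {ckpt}")
```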
Can you elaborate on the exact changes you made in the code?

Which model did you use for the results, e.g. EmbedKGQA_complex, model1, etc.?
It would be helpful if you could give the exact line numbers you changed and the command you ran; then I can check the same on my side.