
I still get 0.49 in metaQA full KG-hop3.

Open ToneLi opened this issue 4 years ago • 7 comments

I still get 0.49 on MetaQA full KG 3-hop. Can you provide the whole project (KG embeddings, commands) or some other advice? 0.49 is too low, and I do not know what is wrong. I used your command and your data. I hope you can provide the relevant project again, or give other advice to help me improve the accuracy.

ToneLi avatar Sep 12 '20 23:09 ToneLi

Can you please give the exact command that you used again? A screenshot of the output would also be helpful.

apoorvumang avatar Oct 12 '20 03:10 apoorvumang

I encountered the same problem. I used the training command for MetaQA_full 2-hop that you released on GitHub, but the accuracy on the test set was only 0.70. Can you provide the training command for MetaQA_full 3-hop?

panhaiming avatar Oct 15 '20 12:10 panhaiming

I got the same problem. With the ComplEx embeddings you provided, the best validation score achieved on MetaQA_full is only 0.717879. I have already unfrozen the embeddings.

sharon-gao avatar Oct 17 '20 15:10 sharon-gao

I got the same problem. I used the training command on the 3-hop MetaQA data, but the model stopped early after 14 epochs with an accuracy of about 0.141376, and training took about 2 days. When I set roberta_model.parameters.requires_grad = False, it reached 0.599131 accuracy at epoch 37 in a shorter time. The command is "python RoBERTa/main.py --mode train --relation_dim 200 --hidden_dim 256 --gpu 3 --freeze 0 --batch_size 128 --validate_every 5 --hops 3 --lr 0.0005 --entdrop 0.1 --reldrop 0.2 --scoredrop 0.2 --decay 1.0 --model ComplEx --patience 10 --ls 0.0 --outfile 3hop", and I use "qa_train_3hop.txt" to train the model. Can you provide the training log? I would like to know how long the best model took to train.

mili6qm avatar Oct 27 '20 01:10 mili6qm
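For readers unfamiliar with the requires_grad = False trick mentioned in the comment above: it disables gradient updates for the pretrained encoder so only the rest of the model is trained. A minimal pure-Python sketch (in the real project the parameters are PyTorch tensors carrying a requires_grad flag; the EncoderStub class here is a hypothetical stand-in, not the repository's code):

```python
# Pure-Python sketch of freezing a pretrained encoder's parameters.
# In PyTorch, each parameter tensor has a `requires_grad` attribute;
# setting it to False excludes that parameter from gradient updates.
class Param:
    def __init__(self):
        self.requires_grad = True  # parameters default to trainable


class EncoderStub:
    """Hypothetical stand-in for the pretrained roberta_model."""
    def __init__(self, n_params=4):
        self._params = [Param() for _ in range(n_params)]

    def parameters(self):
        return iter(self._params)


def freeze(model):
    # Equivalent of: for p in roberta_model.parameters(): p.requires_grad = False
    for p in model.parameters():
        p.requires_grad = False


encoder = EncoderStub()
freeze(encoder)
```

Freezing the encoder shrinks the set of trainable parameters considerably, which is consistent with the shorter training time reported above.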

Please use LSTM for MetaQA datasets

apoorvumang avatar Oct 29 '20 15:10 apoorvumang

Thanks for your reply, I will try it.

mili6qm avatar Oct 29 '20 15:10 mili6qm

Hello. I used the LSTM for the MetaQA 3-hop full dataset and trained with many sets of hyperparameters, but the result only reached 0.728 at best. Here is the command that gave the best result: python main_LSTM.py --mode train --relation_dim 200 --hidden_dim 256 --gpu 0 --freeze 0 --batch_size 1024 --validate_every 5 --hops 3 --lr 0.0005 --entdrop 0.1 --reldrop 0.2 --scoredrop 0.2 --decay 1.0 --model ComplEx --patience 12 --ls 0.0 --kg_type full

Ironeie avatar Nov 25 '20 07:11 Ironeie
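Several comments in this thread mention the --patience flag and early stopping (e.g. "the model stopped early after 14 epochs"). For readers unfamiliar with the mechanism, a generic early-stopping loop (an illustrative sketch, not the repository's actual implementation) looks roughly like this:

```python
# Generic early-stopping sketch: stop when the validation score fails to
# improve for `patience` consecutive evaluations.
def train_with_patience(val_scores, patience):
    best, best_epoch, waited = float("-inf"), -1, 0
    for epoch, score in enumerate(val_scores):
        if score > best:
            best, best_epoch, waited = score, epoch, 0  # new best: reset counter
        else:
            waited += 1
            if waited >= patience:
                break  # no improvement for `patience` checks: stop early
    return best, best_epoch

# Scores plateau after epoch 2; with patience=3, training stops before
# exhausting the list.
best, epoch = train_with_patience([0.40, 0.55, 0.60, 0.59, 0.58, 0.57, 0.56], 3)
```

With a larger --patience value, training runs longer before giving up on a plateau, which can matter when validation accuracy improves slowly, as reported in this thread.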