KBQA
Indexing procedure
Hi! What is the procedure for step 4, 'Index entities and predicates into ES'? util/index.py requires an entity-frequency file — is that something we're supposed to create from dbpedia2016-04en.hdt and then feed into it? Thank you.
Hi! Have you tried to construct the entity-frequency file? Does it work?
I created it and the indexing worked, yes. However, we're still unclear about what preprocessing was done on the input dataset to get it into the format consumed by the Jupyter notebook. There seems to be a single 'train' flag in the JSON indicating whether an example belongs to the train or test split, but we ran into issues such as a question ID not being found (KeyError). Have you managed to get it working?
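For anyone else hitting this step: constructing the entity-frequency file boils down to counting how often each URI occurs in the dump. A rough sketch is below. Note the assumptions: it uses the third-party `hdt` package (pyHDT) to iterate over dbpedia2016-04en.hdt, it skips literals by checking for an `http` prefix, and it writes tab-separated `uri<TAB>count` lines — the exact format util/index.py expects may differ, so check its parsing code first.

```python
from collections import Counter

def count_entity_frequencies(triples):
    """Count how often each URI appears as a subject or object.

    `triples` is any iterable of (subject, predicate, object) strings.
    Literal objects (anything not starting with "http") are skipped,
    since only entities should end up in the frequency file.
    """
    freq = Counter()
    for s, _p, o in triples:
        freq[s] += 1
        if o.startswith("http"):
            freq[o] += 1
    return freq

def write_frequency_file(triples, path):
    """Write one 'uri<TAB>count' line per entity, most frequent first.

    The tab-separated format is an assumption; adjust to whatever
    util/index.py actually parses.
    """
    freq = count_entity_frequencies(triples)
    with open(path, "w", encoding="utf-8") as f:
        for uri, n in freq.most_common():
            f.write(f"{uri}\t{n}\n")

# Feeding in the HDT dump would look roughly like this
# (requires the `hdt` package, a.k.a. pyHDT):
#
#   from hdt import HDTDocument
#   doc = HDTDocument("dbpedia2016-04en.hdt")
#   triples, cardinality = doc.search_triples("", "", "")
#   write_frequency_file(triples, "entity_frequency.tsv")
```

Iterating the full DBpedia dump this way takes a while; HDT keeps the file compressed on disk, so memory use is dominated by the `Counter` itself.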
Thanks for reporting this. You could try "/data/lcquad_clean.json" as the input data and modify the Jupyter notebook to match it. I suspect some preprocessing steps are missing from the documentation.