Apoorv Saxena
Hi LinxiCai, thanks for your interest. We studied MetaQA for the QA task, not the KG completion task. We want to pretrain on the whole KG (or 50% of the KG, depending on...
> Hello sir. As for TKGE models, can I use other models, such as TA-TransE?

Yes.
Hi Zhen, thanks for the response.

> Since the CronQuestions dataset has gold topic entities, I think the NERD step can be removed from the EXAQT pipeline.

Yes, we can probably...
We have posted aliases for WikiKG90Mv2 as well in the README. For other datasets, we unfortunately do not have records currently.
Hi, thanks for your interest! Yes, you could use the model, but you would have to train it again. Currently, only English/Latin characters were used in the pretraining. You would...
You could try the following:

1. Convert your KG into a verbalized format. This means that for each triple in the train KG, e.g. (obama, president of, USA), make 2 lines as...
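For concreteness, here is a minimal Python sketch of that verbalization step: one tail-prediction line and one head-prediction line per triple. The exact prompt strings and separator (`predict tail:`, `predict head:`, `|`) are assumptions modeled on the KGT5 format; please check the repo/model card for the strings actually used.

```python
# Minimal sketch: verbalize each KG triple into two text-to-text
# training lines (tail prediction and head prediction).
# The prompt/separator strings below are assumptions, not the
# verified KGT5 format -- check the repo before training.

def verbalize(triples):
    """triples: iterable of (head, relation, tail) strings."""
    lines = []
    for h, r, t in triples:
        # Line 1: given head + relation, predict the tail entity
        lines.append((f"predict tail: {h} | {r}", t))
        # Line 2: given tail + relation, predict the head entity
        lines.append((f"predict head: {t} | {r}", h))
    return lines

if __name__ == "__main__":
    train_kg = [("obama", "president of", "USA")]
    for src, tgt in verbalize(train_kg):
        # Each pair becomes one input/target example for seq2seq training
        print(f"{src}\t{tgt}")
```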
> How could I know the details of your pretrained model?

What specific details are you looking for that are not in the paper or on https://huggingface.co/apoorvumang/kgt5-base-wikikg90mv2?
You would also need to change the tokenizer. The default T5 tokenizer might not be good enough for Chinese (I'm not sure, though).
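If you do retrain the tokenizer, something like the following could work: a minimal sketch that trains a fresh SentencePiece vocabulary on your Chinese corpus and wraps it as a T5 tokenizer. The file names, `vocab_size`, and `character_coverage` values here are placeholder assumptions, not values we used.

```python
# Minimal sketch: train a new SentencePiece vocabulary on a Chinese
# corpus and load it as a T5 tokenizer. Paths and sizes are
# hypothetical placeholders.
import sentencepiece as spm
from transformers import T5Tokenizer

# corpus.txt: one verbalized KG line per row (hypothetical path)
spm.SentencePieceTrainer.train(
    input="corpus.txt",
    model_prefix="kg_zh",
    vocab_size=32000,
    character_coverage=0.9995,  # higher coverage helps CJK scripts
)

# T5Tokenizer can load a raw SentencePiece model file directly
tokenizer = T5Tokenizer(vocab_file="kg_zh.model")
print(tokenizer.tokenize("奥巴马 | 美国总统"))
```

Since the vocabulary changes, you would also need to resize the model's embeddings (`model.resize_token_embeddings(len(tokenizer))`) and then pretrain from scratch, as noted above.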
Hi Ankush, thanks for your interest. We will be adding the clean QA fine-tuning code soon. However, if you want to do it yourself earlier (and are OK with modifying...
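In the meantime, a rough sketch of what the fine-tuning loop could look like is below. This is not our cleaned-up pipeline: the learning rate, the toy QA pair, and the absence of any task prefix are all assumptions on top of the public HF checkpoint.

```python
# Minimal sketch (not the authors' pipeline): fine-tune the public
# KGT5 checkpoint on QA pairs with the standard seq2seq loss.
# Hyperparameters and data format are placeholder assumptions;
# the model may expect a task prefix on the input.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

name = "apoorvumang/kgt5-base-wikikg90mv2"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSeq2SeqLM.from_pretrained(name)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

qa_pairs = [("who is the president of USA", "obama")]  # toy example

model.train()
for question, answer in qa_pairs:
    inputs = tokenizer(question, return_tensors="pt")
    labels = tokenizer(answer, return_tensors="pt").input_ids
    loss = model(**inputs, labels=labels).loss  # teacher-forced CE loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```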
Could you please tell me, roughly:

1. How many triples are in your KG?
2. How many QA pairs are in your dataset?

Based on this I can suggest the best...