Hong Wang
Sorry for the late reply. I think the checkpoint is saved during training. You may need to train a model first using the command in the document. Please...
Thanks for pointing that out! We think this is caused by overfitting. ‘Not NA acc’ is computed on the training data, while ‘test F1’ is computed on the development data....
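For illustration, here is a minimal sketch (not the repository's evaluation code) of what 'Not NA acc' measures, assuming NA is encoded as relation id 0: accuracy restricted to examples whose gold relation is not NA, computed on training predictions, while test F1 is measured on the development set.

```python
def not_na_accuracy(gold_labels, pred_labels, na_label=0):
    """Accuracy restricted to examples whose gold relation is not NA."""
    pairs = [(g, p) for g, p in zip(gold_labels, pred_labels) if g != na_label]
    if not pairs:
        return 0.0
    correct = sum(1 for g, p in pairs if g == p)
    return correct / len(pairs)

# Training-set accuracy can keep rising (overfitting) while dev-set F1 stalls.
train_gold = [3, 5, 0, 7, 2]
train_pred = [3, 5, 1, 7, 2]
print(not_na_accuracy(train_gold, train_pred))  # 1.0 on the non-NA training examples
```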
Sorry for the late reply. Yes, the BiLSTM is the baseline model in the DocRED paper.
Yes. "rel_exist_bert_cls_sep" branch is used for the first phase, and the master branch is used for RE prediction.
I think you are right. I did this to see if the combination of global context and sentence embedding would help. You should use sent_emb if you want to use...
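To make the two options concrete, here is a rough sketch with hypothetical tensor names (sent_emb for the sentence-level embedding, global_ctx for the document-level context); it is not the repository's code, only an illustration of using the sentence embedding alone versus concatenating it with global context.

```python
import torch

batch, sent_dim, ctx_dim = 8, 256, 256
sent_emb = torch.randn(batch, sent_dim)      # sentence-level encoder output
global_ctx = torch.randn(batch, ctx_dim)     # document-level context (hypothetical)

features_sentence_only = sent_emb                          # recommended: sent_emb alone
features_combined = torch.cat([sent_emb, global_ctx], -1)  # the experimental combination
```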
Thank you for the interest! The SentModel is based on a BiLSTM, and you can find the code in the sentence-level encoder branch.
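For readers who want a feel for the architecture before checking out that branch, here is a minimal BiLSTM sentence-encoder sketch in standard PyTorch; the class and parameter names are illustrative, and the actual SentModel may differ in its details.

```python
import torch
import torch.nn as nn

class SentEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                              bidirectional=True)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) -> sentence embedding: (batch, 2 * hidden_dim)
        hidden_states, _ = self.bilstm(self.embed(token_ids))
        return hidden_states.max(dim=1).values  # max-pool over time steps

encoder = SentEncoder(vocab_size=30000)
print(encoder(torch.randint(0, 30000, (4, 20))).shape)  # torch.Size([4, 256])
```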
If there are multiple mentions, you can average the entity embeddings of all mentions. The specific sentence means the sentence in which the entity is located. Best, Hong
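As a concrete illustration of the averaging step above, here is a small sketch; token_reps, mention_spans, and the span offsets are hypothetical names for this example, not the repository's API.

```python
import torch

# token_reps: contextualized token embeddings for one document, (seq_len, dim)
token_reps = torch.randn(128, 256)
# mention_spans: (start, end) token offsets of every mention of the entity
mention_spans = [(3, 5), (40, 43), (97, 98)]

mention_embs = [token_reps[s:e].mean(dim=0) for s, e in mention_spans]
entity_emb = torch.stack(mention_embs).mean(dim=0)  # average over all mentions
print(entity_emb.shape)  # torch.Size([256])
```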