ruikun luo

6 comments by ruikun luo

I have this problem too.

Sorry, I might have used the wrong version of coqa_util. I will fix it soon. rational_mask indicates the supporting segment for the answer in the paragraph; a small sketch follows.
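For illustration only, here is a minimal sketch of what such a rationale mask could look like over paragraph tokens. The token list and span indices are made up and are not taken from coqa_util.

```python
# Hypothetical illustration of a rationale mask: 1 marks paragraph tokens
# inside the supporting (evidence) span for the answer, 0 elsewhere.
paragraph_tokens = ["The", "cat", "sat", "on", "the", "mat", "."]
rationale_start, rationale_end = 2, 6  # evidence span "sat on the mat" (end exclusive)

rational_mask = [
    1 if rationale_start <= i < rationale_end else 0
    for i in range(len(paragraph_tokens))
]
print(rational_mask)  # [0, 0, 1, 1, 1, 1, 0]
```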

The code has been updated.

@hquzhuguofeng Maybe you loaded the raw model without fine-tuning when running evaluation. You may have already fine-tuned it, but did not load the fine-tuned checkpoint for evaluation.
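As a rough sketch, assuming the fine-tuned weights were saved with torch.save(model.state_dict(), ...), loading them before evaluation could look like this. The checkpoint path and the BertForQuestionAnswering class are placeholders; the repo's actual model class and output path may differ.

```python
# Sketch: load fine-tuned weights before evaluation instead of the raw
# pre-trained model. Path and model class are placeholders.
import torch
from pytorch_pretrained_bert import BertForQuestionAnswering

model = BertForQuestionAnswering.from_pretrained("bert-base-uncased")
state_dict = torch.load("output/pytorch_model.bin", map_location="cpu")
model.load_state_dict(state_dict)
model.eval()  # evaluation mode: disables dropout
```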

@fairy-of-9 It is caused by a version mismatch of pytorch-pretrained-bert. Hugging Face often changes its transformers code, and I did not notice it at first. It's hard to download the same version...

@naseeihity --do_lower_case should be set when using bert-base-uncased. I have updated the code, so you can try it again.
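For reference, in pytorch-pretrained-bert the lower-casing is applied by the tokenizer, so the flag has to match the uncased checkpoint. A minimal sketch of what --do_lower_case controls (the sample sentence is just an example):

```python
# Sketch: bert-base-uncased expects lower-cased input, so the tokenizer
# should be built with do_lower_case=True (what --do_lower_case toggles).
from pytorch_pretrained_bert import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased", do_lower_case=True)
print(tokenizer.tokenize("Where Did The Cat Sit?"))
# e.g. ['where', 'did', 'the', 'cat', 'sit', '?']
```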