BERT-QnA-Squad_2.0_Finetuned_Model

BERT, which stands for Bidirectional Encoder Representations from Transformers, is the state of the art in transfer learning for NLP.

8 BERT-QnA-Squad_2.0_Finetuned_Model issues (sorted by recently updated)

Hi, I tested your script with the downloaded model Bert-on-Squad-V2.0. I asked questions like: When was BERT born? Who created Earth? The answers are nonsensical instead of answering "I can't...

The model is in eval mode in the test_batch file. A server file has been added which takes a POST request containing the questions and a paragraph, and returns the answer in...
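The server described in this issue could look roughly like the stdlib sketch below: it accepts a POST with a JSON body holding a question and a paragraph and returns a JSON answer. The route, field names, and the placeholder `answer_question` function are assumptions for illustration; the real repo would call its fine-tuned BertForQuestionAnswering model instead.

```python
# Minimal QA server sketch. The model call is a placeholder;
# the route ("/predict") and JSON field names are assumptions.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def answer_question(question: str, paragraph: str) -> str:
    # Placeholder for the fine-tuned BERT model's prediction:
    # here we just return the paragraph's first sentence.
    return paragraph.split(".")[0].strip()

def handle_payload(body: bytes) -> bytes:
    """Parse the request JSON and build the response JSON."""
    data = json.loads(body)
    answer = answer_question(data["question"], data["paragraph"])
    return json.dumps({"answer": answer}).encode()

class QAHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        response = handle_payload(self.rfile.read(length))
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(response)

if __name__ == "__main__":
    HTTPServer(("", 8000), QAHandler).serve_forever()
```

A client would then POST e.g. `{"question": "...", "paragraph": "..."}` to `/predict` and read the `"answer"` field from the response.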

Hello, I have two questions. First: what is the difference between this task and the QA task in transformers? Second: while I am trying to run run_Squad.py...

Excellent work, thanks for sharing. Any chance your code will be upgraded from Hugging Face (HF) pytorch_pretrained_bert to the latest HF transformers? I'm looking at Test_batch.py, trying to figure out what areas...

Great work! Please, what was the accuracy or F1 score after fine-tuning? Thanks

GREAT REPO to start 👍 But I have one question: I have fed Narendra Modi data (more than 5000 words) from Wikipedia, and now I am asking a question on top...

RuntimeError: Error(s) in loading state_dict for BertForQuestionAnswering: Missing key(s) in state_dict: "bert.embeddings.LayerNorm.gamma", "bert.embeddings.LayerNorm.beta", plus the matching "LayerNorm.gamma"/"LayerNorm.beta" keys for each encoder layer ("bert.encoder.layer.0" through "bert.encoder.layer.4", attention output and output sublayers)...
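One likely cause of the missing-key error above: older pytorch_pretrained_bert models name their LayerNorm parameters "gamma"/"beta", while newer checkpoints save them as "weight"/"bias". A hedged sketch of a key-renaming shim follows; the renaming direction is an assumption about this particular checkpoint, and may need to be reversed.

```python
# Rename LayerNorm "weight"/"bias" state_dict keys back to the
# old "gamma"/"beta" names expected by pytorch_pretrained_bert.
def rename_layernorm_keys(state_dict):
    renamed = {}
    for key, value in state_dict.items():
        if "LayerNorm.weight" in key:
            key = key.replace("LayerNorm.weight", "LayerNorm.gamma")
        elif "LayerNorm.bias" in key:
            key = key.replace("LayerNorm.bias", "LayerNorm.beta")
        renamed[key] = value
    return renamed

# Usage sketch (filename is an assumption):
#   state_dict = torch.load("pytorch_model.bin", map_location="cpu")
#   model.load_state_dict(rename_layernorm_keys(state_dict))
```

If the mismatch runs the other way (the model expects "weight"/"bias" but the checkpoint has "gamma"/"beta"), swap the two sides of each replacement.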

If your computer can't get online, the project won't work. ![image](https://user-images.githubusercontent.com/24652283/55047026-4c6faf80-507e-11e9-8937-25593bc67dd2.png)