FinBERT-QA-notebooks
Cannot reproduce the numbers reported in the paper with this notebook
Hey! Thanks for open-sourcing this amazing project!
Just a quick question: I followed this notebook (FinBERT_QA.ipynb) strictly and cannot reproduce the numbers reported in the paper.
More specifically, using bert-qa as the starting point with

config = {'bert_model_name': 'bert-qa',
          'max_seq_len': 512,
          'batch_size': 16,
          'learning_rate': 3e-6,
          'weight_decay': 0.01,
          'n_epochs': 3,
          'num_warmup_steps': 10000}
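One variable I have not fully pinned down is RNG state, so some run-to-run drift may be expected. For reference, this is the kind of seeding helper I would call before training (a minimal sketch; the torch part is an assumption and is guarded so it is skipped when torch is unavailable):

```python
import os
import random

import numpy as np


def set_seed(seed: int = 42) -> None:
    """Pin every RNG source I can think of so runs are comparable."""
    os.environ["PYTHONHASHSEED"] = str(seed)
    random.seed(seed)
    np.random.seed(seed)
    try:
        import torch

        torch.manual_seed(seed)
        torch.cuda.manual_seed_all(seed)
        # Trade some speed for deterministic cuDNN kernels.
        torch.backends.cudnn.deterministic = True
        torch.backends.cudnn.benchmark = False
    except ImportError:
        # torch not installed in this environment; skip its RNGs.
        pass


set_seed(42)
```

Even with this, GPU nondeterminism can leave small metric differences between runs, so I would not expect an exact match to three decimals.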
the results I got were:
Epoch 2:
Train Loss: 0.069 | Train Accuracy: 98.39%
Validation Loss: 0.089 | Validation Accuracy: 98.09%
Average nDCG@10 for 333 queries: 0.476
MRR@10 for 333 queries: 0.442
Average Precision@1 for 333 queries: 0.381
Epoch 3:
Train Loss: 0.055 | Train Accuracy: 98.75%
Validation Loss: 0.097 | Validation Accuracy: 98.1%
Average nDCG@10 for 333 queries: 0.471
MRR@10 for 333 queries: 0.427
Average Precision@1 for 333 queries: 0.357
but the paper reports an nDCG@10 of 0.481.
Any ideas/suggestions? Thank you!