FinBERT
Precise training data of Combo
In the README it says:
Combo Pre-trained continued from original BERT on 2017, 2018, 2019 SEC 10K dataset
but the paper says:
train a Combo Model on top of the last checkpoint of BERT-Base Uncased. This training was done in parallel with FinBERT Prime, using SEC2019 for the first 250,000 and using SEC1999 for the last 250,000.
Which one is correct?