BERT-Dialog-State-Tracking

Loss doesn't converge with the latest version of huggingface/transformers.

J-Mourad opened this issue 2 years ago

I've been trying to reproduce your results. Running your original code reproduces the results reported in the paper. Changing only these two lines:

tokenizer = BertTokenizer.from_pretrained(bert_model)
bert      = BertForSequenceClassification.from_pretrained(bert_model, num_labels=2)

to these, from huggingface/transformers:

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
bert      = BertForSequenceClassification.from_pretrained('bert-base-uncased', num_labels=2)

causes the loss to stop converging and the accuracy to drop sharply. Do you have any idea why this happens?
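
For what it's worth, here is a minimal sketch of how I understand the newer API is meant to be called; the batch construction below is only illustrative, not my actual data pipeline:

import torch

# Illustrative inputs; the real ones come from the dialog-state features.
batch  = tokenizer(["hello world"], return_tensors="pt", padding=True)
labels = torch.tensor([1])

# In huggingface/transformers the forward signature is
# (input_ids, attention_mask, token_type_ids, ...), whereas the older
# library used (input_ids, token_type_ids, attention_mask, ...). If the
# training loop passes the tensors positionally, the segment ids and the
# attention mask get silently swapped, which could explain the behaviour.
outputs = bert(input_ids=batch["input_ids"],
               attention_mask=batch["attention_mask"],
               token_type_ids=batch["token_type_ids"],
               labels=labels)

# Recent versions also return a ModelOutput object rather than a bare loss tensor.
loss = outputs.loss
loss.backward()

I'm also aware that the older BertAdam optimizer handled warmup and weight-decay correction internally, while AdamW in transformers expects an explicit scheduler such as get_linear_schedule_with_warmup, so that difference may be relevant too.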

J-Mourad · Dec 29 '22