BERT-Dialog-State-Tracking
Loss doesn't converge with the latest version of huggingface/transformers.
I've been trying to reproduce your results. Running your original code reproduces the numbers reported in the paper. Changing only these two lines:
tokenizer = BertTokenizer.from_pretrained(bert_model)
bert = BertForSequenceClassification.from_pretrained(bert_model, num_labels=2)
to these from huggingface/transformers:
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
bert = BertForSequenceClassification.from_pretrained('bert-base-uncased', num_labels=2)
makes the loss stop converging, and the accuracy ends up very low. Do you have any idea why this happens?
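For context, one API difference between the two libraries that might be relevant (I'm not sure it's the cause here): in pytorch-pretrained-bert, `BertForSequenceClassification.forward` returns the loss tensor directly when `labels` is passed, while in huggingface/transformers it returns a tuple with the loss as the first element. A training loop written against the old contract would therefore be operating on a tuple rather than a scalar loss. A minimal dependency-free sketch of the two contracts (function names are illustrative, not from either library):

```python
def old_style_forward(logits, loss, labels=None):
    # pytorch-pretrained-bert behaviour: return the loss tensor
    # directly when labels are supplied, otherwise the logits.
    return loss if labels is not None else logits

def new_style_forward(logits, loss, labels=None):
    # transformers behaviour: always return a tuple; when labels
    # are supplied, the loss comes first, followed by the logits.
    return (loss, logits) if labels is not None else (logits,)

# A training loop written for the old API does `loss = model(...)`.
old = old_style_forward([0.1, 0.9], 0.42, labels=[1])

# Against the new API, that same variable is now a tuple, not a loss,
# so it must be unpacked before calling backward():
out = new_style_forward([0.1, 0.9], 0.42, labels=[1])
loss, logits = out
```

If the training loop already unpacks the outputs correctly, this isn't the issue, but it seemed worth ruling out.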