
Unused Weights When Loading Model

Open AshOlogn opened this issue 3 years ago • 0 comments

I am using the newest version of Transformers (i.e. version 3.0) and am trying to load a SciBERT model using the following code:

```python
from transformers import AutoTokenizer, BertForMaskedLM

scibert_tokenizer = AutoTokenizer.from_pretrained('allenai/scibert_scivocab_uncased')
scibert_model = BertForMaskedLM.from_pretrained('allenai/scibert_scivocab_uncased')
```

When I do this I get the following message indicating unused weights:

```
Some weights of the model checkpoint at allenai/scibert_scivocab_uncased were not used when initializing BertForMaskedLM: ['cls.seq_relationship.weight', 'cls.seq_relationship.bias']
- This IS expected if you are initializing BertForMaskedLM from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPretraining model).
- This IS NOT expected if you are initializing BertForMaskedLM from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
```

I do not get a similar message when using an earlier version of Transformers (i.e. a 2.x release).
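For what it's worth, the two weights named in the message (`cls.seq_relationship.weight` and `cls.seq_relationship.bias`) are BERT's next-sentence-prediction head, which `BertForMaskedLM` does not include; the checkpoint simply carries an extra head that the MLM-only class discards. A minimal sketch to confirm where those parameter names live, using a small, hypothetical `BertConfig` instead of the real checkpoint so no download is needed:

```python
from transformers import BertConfig, BertForMaskedLM, BertForPreTraining

# Tiny illustrative config (hypothetical sizes; the real SciBERT config is
# much larger) -- only the parameter *names* matter here.
config = BertConfig(
    vocab_size=100,
    hidden_size=32,
    num_hidden_layers=1,
    num_attention_heads=2,
    intermediate_size=64,
)

# BertForPreTraining has both the MLM head and the next-sentence-prediction
# head, so the seq_relationship parameters exist under exactly the names
# reported in the warning.
pretraining_names = set(dict(BertForPreTraining(config).named_parameters()))
print('cls.seq_relationship.weight' in pretraining_names)

# BertForMaskedLM has only the MLM head, so those parameters are absent --
# which is why from_pretrained reports them as unused.
mlm_names = set(dict(BertForMaskedLM(config).named_parameters()))
print('cls.seq_relationship.weight' in mlm_names)
```

Since the unused weights are just the discarded NSP head, the message should be safe to ignore for masked-LM use; older 2.x releases were quieter about it rather than behaving differently.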

AshOlogn · Oct 31 '20