CodeXiao
No, I am seeing the same error. I also used the same function (tf.clip_by_global_norm), but I found that neither the learning rate nor that function is the key reason. When I generate the vocab, I set the size...
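For context, a minimal sketch of how tf.clip_by_global_norm is typically wired into a training step. The model, dummy data, and clip_norm value here are illustrative placeholders, not taken from the original post:

```python
import tensorflow as tf

# Placeholder model and batch, purely for illustration.
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3)
x = tf.random.normal([8, 4])
y = tf.random.normal([8, 1])

with tf.GradientTape() as tape:
    loss = tf.reduce_mean(tf.square(model(x) - y))

grads = tape.gradient(loss, model.trainable_variables)
# Rescale all gradients together so their combined global norm is at most clip_norm.
clipped, global_norm = tf.clip_by_global_norm(grads, clip_norm=5.0)
optimizer.apply_gradients(zip(clipped, model.trainable_variables))
```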
```python
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("MODEL_NAME")
model = BertModel.from_pretrained("MODEL_NAME")
```

I downloaded the model locally; the MiniRBT-h256-pt folder contains three files: config.json, pytorch_model.bin, and vocab.txt. I replaced MODEL_NAME with the local model path XXXXX/MiniRBT-h256-pt. The config loads correctly, but loading the model raises an error: Some weights of the model checkpoint at XXXXX/MiniRBT-h256-pt were not used when...
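For reference, this transformers message is usually informational rather than an error: BertModel has no pretraining head, so checkpoint weights belonging to one (e.g. the cls.predictions.* parameters of the MLM head) are skipped at load time. A minimal sketch, assuming the unused weights are indeed that head (the local path is the placeholder from the post above):

```python
from transformers import BertForMaskedLM

# Loading the architecture that matches the checkpoint keeps the head weights,
# so the "weights were not used" message should disappear.
model = BertForMaskedLM.from_pretrained("XXXXX/MiniRBT-h256-pt")
```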