
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN

9 bert_language_understanding issues

Following the README, I ran python train_bert_lm.py and then python train_bert_fine_tuning.py; afterwards, running run_classifier_predict_online.py fails (at this point I had copied bert_config.json and vocab.txt from the open-source chinese-bert-base release into the directory where the fine-tuned model is saved). I have tried several ways to verify this and always hit the same error; a reply with a solution would be much appreciated. The error output is: INFO:tensorflow:Restoring parameters from ./checkpoint_finetuing_law200_bert/model.ckpt-1 2019-10-28 19:41:26.392664: W tensorflow/core/framework/op_kernel.cc:1502] OP_REQUIRES failed at save_restore_v2_ops.cc:184 : Not found: Key bert/embeddings/LayerNorm/beta not found in checkpoint Traceback (most recent call...
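A diagnostic sketch that may help here (my own illustration, not code from this repo): TensorFlow's tf.train.list_variables prints every variable name stored in a checkpoint, so you can check whether keys such as bert/embeddings/LayerNorm/beta exist in the fine-tuned checkpoint at all. If they do not, the checkpoint was saved by this repo's own model rather than by Google's BERT, and copying bert_config.json and vocab.txt next to it will not change the stored variable names. The path below is taken from the error message; adjust it to your own directory.

```python
# Diagnostic sketch (assumption: TensorFlow is installed and the checkpoint
# files from the error message exist on disk).
import tensorflow as tf

ckpt_path = "./checkpoint_finetuing_law200_bert/model.ckpt-1"

# tf.train.list_variables returns (name, shape) pairs for every variable
# serialized in the checkpoint, without building a graph or a session.
for name, shape in tf.train.list_variables(ckpt_path):
    print(name, shape)
```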

Error using modeling.BertConfig and modeling.BertModel

Hello, you said that the pre-trained model's effect comes from training on top of the Chinese pre-trained model provided by Google, adding the data at hand, and then training the...

May I ask: instead of using BERT's encoder-decoder, did you design your own CNN (convolutions) for pre-training and fine-tuning? Why? Is it because the original encoder-decoder did not perform as well as the CNN?
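For context on what is being asked: this repo pre-trains a TextCNN rather than a Transformer encoder. Below is a minimal TextCNN sketch of my own (illustrative hyperparameters, not the repo's code) showing the parallel multi-width convolutions typical of that architecture.

```python
# Minimal TextCNN sketch (my illustration; vocab size, sequence length, and
# filter settings are assumptions, not values from this repo).
import tensorflow as tf
from tensorflow.keras import layers

vocab_size, seq_len, embed_dim, num_classes = 21128, 200, 128, 10

inputs = layers.Input(shape=(seq_len,), dtype="int32")
x = layers.Embedding(vocab_size, embed_dim)(inputs)

# Parallel 1-D convolutions with several filter widths, each followed by
# max-over-time pooling, as in the standard TextCNN design (Kim, 2014).
pooled = []
for width in (2, 3, 4):
    c = layers.Conv1D(filters=128, kernel_size=width, activation="relu")(x)
    pooled.append(layers.GlobalMaxPooling1D()(c))

x = layers.Concatenate()(pooled)
outputs = layers.Dense(num_classes, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)
model.summary()
```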

pre-train masked language model with BERT: python train_bert_lm.py [DONE] — what does [DONE] mean here? Doesn't the .py script need a file path after it? Please advise.
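For context, [DONE] in the README appears to be a status marker indicating that the step is already implemented, and the script presumably reads its data paths from config.py rather than from command-line arguments (an assumption on my part). As for what the masked-LM pre-training step does conceptually, here is a minimal sketch of BERT-style random masking (my illustration, not the repo's exact preprocessing): roughly 15% of token positions are replaced by a [MASK] id, and the original ids at those positions become the prediction targets.

```python
# Masked-LM data sketch (assumptions: MASK_ID is the vocabulary id of
# [MASK]; IGNORE marks positions excluded from the loss).
import random

MASK_ID = 103   # assumed id of [MASK] in the vocabulary
IGNORE = -1     # label for positions that are not predicted

def mask_tokens(token_ids, mask_prob=0.15):
    inputs, labels = [], []
    for tid in token_ids:
        if random.random() < mask_prob:
            inputs.append(MASK_ID)
            labels.append(tid)      # the model must recover this token
        else:
            inputs.append(tid)
            labels.append(IGNORE)   # no loss computed at this position
    return inputs, labels

print(mask_tokens([5, 17, 42, 8, 99, 23]))
```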

1. Some parameters are missing from config.py, such as self.sequence_length_lm and self.is_fine_tuning.
2. The meaning of self.ckpt_dir is unclear.
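To make the report concrete, here is a hypothetical sketch of what the missing config.py entries might look like. The attribute names come from the issue itself; every default value and comment is my assumption, not the repo's.

```python
# Hypothetical config.py additions (values are assumptions for illustration).
class Config:
    def __init__(self):
        self.sequence_length_lm = 200       # assumed: max sequence length for LM pre-training
        self.is_fine_tuning = False         # assumed: True when restoring a pre-trained checkpoint to fine-tune
        self.ckpt_dir = "./checkpoint_lm/"  # assumed: directory where model.ckpt-* files are saved and restored
```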