BERT4doc-Classification
Code and source for the paper "How to Fine-Tune BERT for Text Classification?"
Sorry to bother you, but it seems to me that run_classifier_single_layer.py does not save the model. What should I do to further fine-tune the fine-tuned model? Thanks!
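As a possible workaround, the fine-tuned weights could be saved and reloaded manually with PyTorch's standard checkpoint calls. A minimal sketch (the helper names are my own illustration, and it assumes the model object is re-instantiated with the same architecture before loading):

```python
import torch

def save_checkpoint(model, path):
    # Persist only the weights, so the checkpoint can seed a
    # second fine-tuning run on the same architecture.
    torch.save(model.state_dict(), path)

def load_for_further_finetuning(model, path):
    # `model` is assumed to already be an instance of the same
    # model class; this just restores the fine-tuned weights.
    model.load_state_dict(torch.load(path, map_location="cpu"))
    return model
```

After loading, the returned model can be passed to the training loop again to continue fine-tuning.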
@xuyige Time taken for:

```shell
!python run_pretraining.py \
  --input_file=./tmp/tf_examples.tfrecord \
  --output_dir=./tmp/pretraining_output \
  --do_train=True \
  --do_eval=True \
  --bert_config_file=./uncased_L-12_H-768_A-12/bert_config.json \
  --init_checkpoint=./uncased_L-12_H-768_A-12/bert_model.ckpt \
  --train_batch_size=32 \
  --max_seq_length=128 \
  --max_predictions_per_seq=20 \
  --num_train_steps=100000 \
  --num_warmup_steps=10000 \
  --learning_rate=5e-5 \
  ...
```
Hello, the embedding layer in BERT sits before Layer 0. Would it be better to set its learning rate to Layer 0's learning rate multiplied by the decay factor ξ (0.95)?
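For illustration, here is a minimal sketch of how such layer-wise learning-rate decay could be computed, treating the embedding layer as one level below Layer 0. The factor ξ = 0.95 follows the paper's decay-factor discussion; the function name and the idea of returning a per-layer rate map are my own assumptions, not the repo's API:

```python
def layerwise_lrs(base_lr=2e-5, xi=0.95, num_layers=12):
    """Per-layer learning rates with decay factor xi.

    The top encoder layer (num_layers - 1) gets base_lr; each layer
    below it is scaled by xi once more, and the embedding layer, one
    level below Layer 0, gets Layer 0's rate times xi again.
    """
    lrs = {f"layer_{i}": base_lr * xi ** (num_layers - 1 - i)
           for i in range(num_layers)}
    lrs["embeddings"] = lrs["layer_0"] * xi
    return lrs
```

These rates could then be attached to the corresponding parameter groups when constructing the optimizer, e.g. one `{"params": ..., "lr": lrs["layer_0"]}` entry per layer.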