Task about UniLM model
Nice job! Are there any plans to support the UniLM model?
Not yet, but I know the model, and the idea is just great. How about before this weekend?
I'm looking forward to it.
@wangbq18 Thanks for your suggestion. UNIF now supports UniLM (≥ beta v2.4.9).
Here are the initialization details of UniLM:
model = uf.UniLM(
    config_file='demo/bert_config.json',
    vocab_file='demo/vocab.txt',
    max_seq_length=128,
    init_checkpoint=None,
    output_dir=None,
    gpu_ids=None,
    drop_pooler=False,
    do_sample_next_sentence=True,
    max_predictions_per_seq=20,
    masked_lm_prob=0.15,
    short_seq_prob=0.1,
    do_whole_word_mask=False,
    mode='bi',
    do_lower_case=True,
    truncate_method='LIFO')
If you wish to switch to another mode, such as sequence-to-sequence language modeling, simply run model.to_mode('s2s').
The variable namespace of UniLM is exactly the same as BERT's, so after pretraining UniLM you can fine-tune the model with BERT-series modules, e.g. BERTClassifier and BERTMRC. Whenever you encounter any problems, please let me know :)
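For reference, here is a minimal sketch of that pretrain-then-fine-tune flow, assuming UNIF's scikit-learn-style fit()/predict() interface; the output directory and the toy data are placeholders, and exact argument names may differ between versions:

import uf

# 1) Pretrain UniLM on raw text (default bidirectional LM mode).
#    'unilm_pretrained' is a hypothetical output directory where
#    checkpoints are assumed to be written.
model = uf.UniLM(
    config_file='demo/bert_config.json',
    vocab_file='demo/vocab.txt',
    max_seq_length=128,
    output_dir='unilm_pretrained')
X_pretrain = ['first raw document ...', 'second raw document ...']
model.fit(X_pretrain)

# 2) Fine-tune with a BERT-series module, initializing from the
#    UniLM weights (the variable namespace is identical to BERT's).
clf = uf.BERTClassifier(
    config_file='demo/bert_config.json',
    vocab_file='demo/vocab.txt',
    max_seq_length=128,
    init_checkpoint='unilm_pretrained',  # point at the pretrained weights
    label_size=2)
X_train, y_train = ['a labeled example', 'another example'], [0, 1]
clf.fit(X_train, y_train)
print(clf.predict(['a new example']))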
Thanks for your work!