
task about UNILM model

Open wangbq18 opened this issue 4 years ago • 4 comments

Nice job! Is there any plan about unilm model?

wangbq18 avatar Dec 29 '20 02:12 wangbq18

Not yet, but I know the model, and the idea is just great. How about before this weekend?

geyingli avatar Dec 29 '20 06:12 geyingli

> Not yet, but I know the model, and the idea is just great. How about before this weekend?

I'm looking forward to it.

wangbq18 avatar Dec 29 '20 07:12 wangbq18

@wangbq18 Thanks for your suggestion. UNIF now supports UniLM (≥ beta v2.4.9).

Here are the construction details of UniLM:

```python
model = uf.UniLM(
    config_file='demo/bert_config.json',
    vocab_file='demo/vocab.txt',
    max_seq_length=128,
    init_checkpoint=None,
    output_dir=None,
    gpu_ids=None,
    drop_pooler=False,
    do_sample_next_sentence=True,
    max_predictions_per_seq=20,
    masked_lm_prob=0.15,
    short_seq_prob=0.1,
    do_whole_word_mask=False,
    mode='bi',
    do_lower_case=True,
    truncate_method='LIFO',
)
```
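
For a quick start, a minimal pretraining run might look like the sketch below (this assumes UNIF's scikit-learn-style `fit` interface; the corpus and `output_dir` here are only illustrative, not verbatim from the library):

```python
import uf

# A minimal pretraining sketch, not a definitive recipe:
# `corpus` stands for your own list of raw texts, and the exact
# fit() arguments may differ from this illustration.
corpus = [
    'unilm unifies bidirectional, unidirectional and seq2seq objectives.',
    'different attention masks share one transformer backbone.',
]

model = uf.UniLM(
    config_file='demo/bert_config.json',
    vocab_file='demo/vocab.txt',
    max_seq_length=128,
    output_dir='unilm_ckpt',  # where checkpoints are saved (illustrative path)
)
model.fit(corpus)             # assumed scikit-learn-style training entry point
```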

If you wish to switch to another mode, such as sequence-to-sequence language modeling, simply run `model.to_mode('s2s')`.
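
For instance, to alternate objectives during pretraining (only `'bi'` and `'s2s'` appear here, since those are the two modes mentioned above):

```python
model.to_mode('s2s')   # switch to sequence-to-sequence LM
# ... run the seq2seq objective for some steps ...
model.to_mode('bi')    # switch back to the default bidirectional LM
```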

UniLM shares its variable namespace with BERT, so after pretraining UniLM you can fine-tune the checkpoint with the BERT-series modules, e.g. `BERTClassifier`, `BERTMRC`, etc. (see the sketch below). Whenever you encounter any problems, please let me know :)
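
A rough fine-tuning sketch: the checkpoint path, the `label_size` argument, and the toy data are my illustrations, not verbatim from UNIF; only the config/vocab/`init_checkpoint` arguments mirror the constructor above:

```python
# Hypothetical fine-tuning sketch: load the UniLM pretraining checkpoint
# into a BERT-series module. Since the variable namespaces match, the
# pretrained weights should load directly.
clf = uf.BERTClassifier(
    config_file='demo/bert_config.json',
    vocab_file='demo/vocab.txt',
    max_seq_length=128,
    init_checkpoint='unilm_ckpt',  # UniLM checkpoint from the sketch above
    label_size=2,                  # assumed argument for number of classes
)
X = ['a great movie', 'a total waste of time']
y = [1, 0]
clf.fit(X, y)          # assumed scikit-learn-style interface
print(clf.predict(X))
```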

geyingli avatar Dec 31 '20 07:12 geyingli

Thanks for your work!

wangbq18 avatar Dec 31 '20 08:12 wangbq18