
support for text generation task

Open zhipeng-cai opened this issue 1 year ago • 0 comments

I wonder how to use a BERT model for multitask learning. I have two tasks: sequence classification and seq2seq text generation. Can I use a BERT model as the encoder and another BERT model as the decoder for the text generation task? For the sequence classification task, I want to use a sequence classification prediction head; both tasks would share the same BERT encoder. Is that right?
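A minimal sketch of the architecture described above, assuming the Hugging Face `transformers` library: a BERT encoder paired with a BERT decoder via `EncoderDecoderModel`. The tiny config sizes are illustrative only; a real setup would load pretrained checkpoints instead of random weights.

```python
# Sketch (not from the issue): wiring a BERT encoder and a BERT decoder into
# a seq2seq model with Hugging Face's EncoderDecoderModel. The config values
# are tiny illustrative placeholders, not a recommended configuration.
import torch
from transformers import BertConfig, BertModel, BertLMHeadModel, EncoderDecoderModel

enc_cfg = BertConfig(vocab_size=100, hidden_size=32, num_hidden_layers=2,
                     num_attention_heads=2, intermediate_size=64)
# The decoder needs cross-attention over the encoder's hidden states.
dec_cfg = BertConfig(vocab_size=100, hidden_size=32, num_hidden_layers=2,
                     num_attention_heads=2, intermediate_size=64,
                     is_decoder=True, add_cross_attention=True)

encoder = BertModel(enc_cfg)        # shared BERT encoder
decoder = BertLMHeadModel(dec_cfg)  # BERT decoder with a language-modeling head
seq2seq = EncoderDecoderModel(encoder=encoder, decoder=decoder)

ids = torch.randint(0, enc_cfg.vocab_size, (2, 8))
out = seq2seq(input_ids=ids, decoder_input_ids=ids, labels=ids)
# out.loss is the generation-task training loss; out.logits has shape
# (batch, sequence_length, vocab_size).
```

The same `encoder` object can then also feed a classification head, which is what makes the two tasks share parameters.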

I want to create a MultitaskModel with a BERT encoder, an encoder-decoder model combining that BERT encoder with another BERT decoder, and a sequence classification prediction head. In the forward() method, if the training batch is from the text generation task, I would call the encoder-decoder model's forward() method; if the training batch is from the sequence classification task, I would call the forward() method of the classification head. Is that implementation right?
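One way the routing described above could look, as a hedged sketch (the `MultitaskModel` class and its `task` argument are hypothetical names, not from any library): a shared `BertModel` encoder whose output goes either to a linear classification head or to a `BertLMHeadModel` decoder via cross-attention, depending on which task the batch belongs to.

```python
# Hypothetical sketch of the routing idea: one shared BERT encoder, with the
# forward() call dispatching to a classification head or to a BERT decoder
# depending on the task of the current batch. Names are illustrative.
import torch
import torch.nn as nn
from transformers import BertConfig, BertModel, BertLMHeadModel

class MultitaskModel(nn.Module):
    def __init__(self, enc_cfg, dec_cfg, num_labels):
        super().__init__()
        self.encoder = BertModel(enc_cfg)           # shared across both tasks
        self.decoder = BertLMHeadModel(dec_cfg)     # needs add_cross_attention=True
        self.classifier = nn.Linear(enc_cfg.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask=None, task="classification",
                decoder_input_ids=None, labels=None):
        enc_out = self.encoder(input_ids, attention_mask=attention_mask)
        if task == "classification":
            # Use the [CLS] position's hidden state as the sequence summary.
            pooled = enc_out.last_hidden_state[:, 0]
            return self.classifier(pooled)
        # Generation batch: feed encoder states to the decoder via cross-attention.
        return self.decoder(
            input_ids=decoder_input_ids,
            encoder_hidden_states=enc_out.last_hidden_state,
            encoder_attention_mask=attention_mask,
            labels=labels,
        )
```

A usage sketch with tiny random-weight configs: build `enc_cfg` as a small `BertConfig` and `dec_cfg` as the same config with `is_decoder=True, add_cross_attention=True`, then call the model once per task and backpropagate the corresponding loss. The key design point is that both branches run the same `self.encoder`, so its gradients accumulate from both tasks.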

Thank you very much!

zhipeng-cai avatar Mar 23 '23 11:03 zhipeng-cai