multitask-learning-transformers

A simple recipe for training and running inference with Transformer architectures for Multi-Task Learning on custom datasets. This repo contains two approaches for achieving this.
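One common way to structure multi-task learning with Transformers is a single shared encoder with one prediction head per task. The sketch below illustrates that idea only; the class and parameter names (`SharedEncoderModel`, `num_labels_per_task`) are illustrative and not the repo's actual API, and a plain linear layer stands in for the Transformer encoder body.

```python
import torch
import torch.nn as nn

class SharedEncoderModel(nn.Module):
    """Illustrative shared-encoder multi-task model (not the repo's class)."""

    def __init__(self, input_size=8, hidden_size=16, num_labels_per_task=(2, 3)):
        super().__init__()
        # Stand-in for a Transformer encoder body (e.g. BERT).
        self.encoder = nn.Linear(input_size, hidden_size)
        # One classification head per task; all heads share the encoder.
        self.heads = nn.ModuleList(
            nn.Linear(hidden_size, n) for n in num_labels_per_task
        )

    def forward(self, x, task_id):
        pooled = torch.tanh(self.encoder(x))
        return self.heads[task_id](pooled)

model = SharedEncoderModel()
batch = torch.randn(4, 8)
logits_task0 = model(batch, task_id=0)  # shape (4, 2)
logits_task1 = model(batch, task_id=1)  # shape (4, 3)
```

During training, batches from different tasks are typically interleaved and each batch is routed to its own head while gradients update the shared encoder.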

3 issues in multitask-learning-transformers

Hi @shahrukhx01, thank you so much for sharing this nice repo. How can we combine the attention of all task heads for the shared encoder model and multiple prediction head...

Thanks for your work. I have a problem: training uses the CPU, not the GPU.
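A frequent cause of CPU-only training is that the model and batches are never moved to the GPU. A minimal sketch, assuming a plain PyTorch training setup (the `nn.Linear` below is just a stand-in for the actual model):

```python
import torch

# Pick the GPU when one is available, otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Stand-in for the Transformer model; .to(device) moves its parameters.
model = torch.nn.Linear(8, 2).to(device)

# Every input batch must also be moved to the same device.
batch = torch.randn(4, 8).to(device)
logits = model(batch)
```

If `torch.cuda.is_available()` returns `False`, the fix is usually on the environment side (installing a CUDA-enabled PyTorch build) rather than in the training code.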

I wonder how to use a BERT model for multi-task learning. I have two tasks: sequence classification and seq2seq text generation. Can I use a BERT model as the encoder, and another...
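Sharing one encoder between a classification head and a seq2seq decoder is feasible in principle. The sketch below shows the wiring with small `torch.nn` Transformer layers standing in for BERT; the class and method names (`ClassifyAndGenerate`, `classify`, `generate_step`) are illustrative assumptions, not the repo's API.

```python
import torch
import torch.nn as nn

class ClassifyAndGenerate(nn.Module):
    """Illustrative model: one shared encoder, two task-specific outputs."""

    def __init__(self, vocab=32, hidden=16, num_labels=2):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        enc_layer = nn.TransformerEncoderLayer(hidden, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=1)  # shared
        # Task 1: sequence classification over mean-pooled encoder states.
        self.cls_head = nn.Linear(hidden, num_labels)
        # Task 2: seq2seq generation via a decoder attending to the encoder.
        dec_layer = nn.TransformerDecoderLayer(hidden, nhead=4, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers=1)
        self.lm_head = nn.Linear(hidden, vocab)

    def classify(self, src):
        h = self.encoder(self.embed(src))
        return self.cls_head(h.mean(dim=1))

    def generate_step(self, src, tgt):
        memory = self.encoder(self.embed(src))
        return self.lm_head(self.decoder(self.embed(tgt), memory))

model = ClassifyAndGenerate()
src = torch.randint(0, 32, (2, 5))  # batch of 2 source sequences, length 5
tgt = torch.randint(0, 32, (2, 3))  # decoder input so far, length 3
cls_logits = model.classify(src)            # shape (2, 2)
gen_logits = model.generate_step(src, tgt)  # shape (2, 3, 32)
```

With a real BERT encoder, the same pattern applies: the encoder's hidden states feed both the classification head and the decoder's cross-attention.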