The LongT5 documentation conflicts with its example code regarding the task prefix
System Info
All versions.
Who can help?
@patrickvonplaten
Reproduction
See https://huggingface.co/docs/transformers/main/en/model_doc/longt5
Expected behavior
In the document above, it says: "Unlike the T5 model, LongT5 does not use a task prefix. Furthermore, it uses a different pre-training objective inspired by the pre-training of [PegasusForConditionalGeneration]." However, the example code for LongT5ForConditionalGeneration prepends a "summarize: " prefix to the input. These two statements conflict, and I am confused about how to use LongT5 for different downstream tasks. Could you please help? Thanks.
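For reference, here is a minimal sketch of what I would expect prefix-free usage to look like, based on the quoted documentation. The `google/long-t5-tglobal-base` checkpoint and the placeholder input text are just for illustration, and passing the raw document without a prefix is my assumption from the doc, not a confirmed answer:

```python
from transformers import AutoTokenizer, LongT5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/long-t5-tglobal-base")
model = LongT5ForConditionalGeneration.from_pretrained("google/long-t5-tglobal-base")

article = "A long input document to summarize ..."  # placeholder text

# Per the documentation quoted above, the raw document is passed directly,
# with NO "summarize: " task prefix -- which is exactly what the example
# code in the docs does differently.
inputs = tokenizer(article, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```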