fairseq
BART model fine-tuned with fairseq cannot generate more than 155 characters and ends with the sign "�"
🐛 Bug
To Reproduce
I used a BART model fine-tuned with fairseq for a machine translation task, but the trained model cannot generate more than 155 characters, and the output ends with the sign "�". Why is the output generated by the trained model so strange?
My guess is that the BART model can only embed 1024 tokens, so sequences longer than 1024 tokens cannot be generated. That is only a loose guess — I hope someone can answer.
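One possible explanation for the trailing "�" (U+FFFD, the Unicode replacement character): if generation is cut off by a hard length limit in the middle of a multi-byte UTF-8 sequence and the bytes are then decoded leniently, the dangling partial character is rendered as "�". A minimal, fairseq-independent sketch of that effect:

```python
# "�" (U+FFFD) typically appears when a byte stream is cut inside a
# multi-byte UTF-8 sequence and then decoded with errors="replace".
text = "翻译"                        # each CJK character is 3 bytes in UTF-8
raw = text.encode("utf-8")           # 6 bytes total
truncated = raw[:4]                  # cut in the middle of the second character
result = truncated.decode("utf-8", errors="replace")
print(result)                        # → 翻�
```

If this is what is happening, the 155-character cutoff would come from a byte- or token-level generation limit rather than from the text itself, and raising the maximum generation length (or checking where the output is truncated) would be the place to look.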