
Repeated words as decoded/generated output :(

Open · thefirebanks opened this issue 6 years ago · 0 comments

Hello! I was wondering if there is a reason why the final decoded sentence would just be the same word repeated min_dec_steps times. I am training this model on the Quora Question Pairs dataset for paraphrase generation, and I am getting results like the following:

Original sentence: "what books and study materials i will need for political science and ir for cse optionals ?"
Hypothesis: "and and and and and and and and and and and and and and and and and and and and"
Reference: "which of the study material is useful for the upsc cse optional political science and international relations ?"
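
For context, this is roughly the sanity check I run over the decoded files; it is just a minimal sketch of my own (the helper name and the inlined example are mine, not from the repo), flagging hypotheses that collapse to a single repeated token:

```python
# Minimal sketch of the check I run over decoded output (my own helper,
# not part of the repo): flag hypotheses that are one token repeated.

def is_degenerate(hypothesis: str) -> bool:
    """True if the hypothesis consists of a single token repeated."""
    tokens = hypothesis.split()
    return len(tokens) > 1 and len(set(tokens)) == 1

# Example taken from the decoded output above (20 repeated tokens in my case).
source = ("what books and study materials i will need for political science "
          "and ir for cse optionals ?")
hypothesis = " ".join(["and"] * 20)

if is_degenerate(hypothesis):
    print(f"Degenerate hypothesis for source: {source!r}")
```

Essentially every decoded hypothesis in my run trips this check, so it does not look like an occasional decoding hiccup.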

I know this model was originally intended for summarization, but your RLSeq2Seq model worked really well for paraphrasing! I also believe the transfer learning component of this model would be essential for generalization, since current paraphrase generation models suffer from poor generalization as well. Thank you!

thefirebanks · Aug 16 '19 14:08