KG-BART
Inputs ignored during generation
I attempted to use the model for a separate task. With few, if any, changes to the model, it seems that during generation the model ignores all inputs to the decoder and simply predicts the most probable token at each step. Is this a known issue that others have encountered, an issue with the code I wrote, or something that hasn't been seen before?
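For anyone debugging the same symptom, one quick way to narrow it down is to run the same encoder input with two different decoder inputs and check whether the outputs change at all. The sketch below is purely illustrative: `decoder_inputs_ignored`, `broken_model`, and `working_model` are hypothetical stand-ins, not KG-BART code; in practice `model_fn` would wrap the actual forward pass and return next-token logits.

```python
def decoder_inputs_ignored(model_fn, encoder_input, decoder_a, decoder_b):
    """Return True if the model produces identical outputs for two
    different decoder inputs, i.e. the decoder inputs are ignored."""
    out_a = model_fn(encoder_input, decoder_a)
    out_b = model_fn(encoder_input, decoder_b)
    return out_a == out_b

# Toy stand-ins for illustration only (not the KG-BART API):
def broken_model(enc, dec):
    # Ignores the decoder input entirely -- the symptom described above.
    return [sum(enc) % 7] * len(dec)

def working_model(enc, dec):
    # Output depends on both the encoder and decoder inputs.
    return [(sum(enc) + d) % 7 for d in dec]

enc = [1, 2, 3]
dec_a, dec_b = [4, 5], [6, 7]
print(decoder_inputs_ignored(broken_model, enc, dec_a, dec_b))   # True
print(decoder_inputs_ignored(working_model, enc, dec_a, dec_b))  # False
```

If the check returns True on the real model, the fault is likely in how decoder inputs (or cross-attention masks) are wired into the forward pass rather than in the decoding loop itself.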
If anyone has worked with this code and could give feedback, I would appreciate hearing from you. Thanks in advance.