Jiaaochen
Thanks for the note, we will push the updated version.
I used a fairly old version of Transformers, namely pytorch_transformers; I believe that was the predecessor of the current transformers package.
This might be related to the dictionary file for the back-translations: possibly several sentences were not back-translated, e.g., the 9226th sentence.
Thanks for the suggestions!
Is it possible to LoRA-finetune a 65B model on 8 V100s (32 GB each)?
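For context, a back-of-the-envelope estimate (my own arithmetic, not from this thread) of the weight memory alone, ignoring optimizer state and activations, so real usage will be higher:

```python
# Rough memory estimate for a 65B model on 8x V100 (32 GB each).
# Assumption: only the frozen base weights are counted; LoRA adapter
# parameters, optimizer state, and activations add further overhead.

def weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Memory needed just to hold the model weights, in GB."""
    return n_params * bytes_per_param / 1024**3

N_PARAMS = 65e9
TOTAL_VRAM_GB = 8 * 32  # eight V100s, 32 GB each

fp16_gb = weight_memory_gb(N_PARAMS, 2)    # fp16: 2 bytes per parameter
int4_gb = weight_memory_gb(N_PARAMS, 0.5)  # 4-bit quantized: 0.5 bytes per parameter

print(f"fp16 weights:  {fp16_gb:.0f} GB of {TOTAL_VRAM_GB} GB aggregate VRAM")
print(f"4-bit weights: {int4_gb:.0f} GB of {TOTAL_VRAM_GB} GB aggregate VRAM")
```

fp16 weights (~121 GB) only fit if sharded across all eight cards, leaving little room for activations, so in practice a QLoRA-style setup (4-bit base weights, ~30 GB, plus LoRA adapters) is the more realistic route at this scale.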