Translation fails when preloading a fine-tuned mbart50 model
🐛 Bug
I have used the mbart50 pre-trained model to fine-tune a translation task for the Si-En language pair. I now need to preload the fine-tuned model for translation. However, during translation I get the following error:
Traceback (most recent call last):
File "si_en_mbart50ft_checkpoint_preload.py", line 20, in
To Reproduce
Following is my code. Although the checkpoint can be preloaded, the translation fails with the above error. It appears that fine-tuning as a translation_multi_simple_epoch task is what leads to this error; following the example, the fine-tuning had been conducted with the task set to translation_multi_simple_epoch. A possible variant that passes the task explicitly when loading is sketched after the snippet below.
from fairseq.models.bart import BARTModel

BASEDIR = '/userdirs/abc/pretrained_models/mbart50.pretrained'

si2en = BARTModel.from_pretrained(
    '/userdirs/abc/si_en_models/checkpoints',
    checkpoint_file='checkpoint_93_75000.pt',
    bpe='sentencepiece',
    sentencepiece_model=f'{BASEDIR}/sentence.bpe.model',
    lang_dict=f'{BASEDIR}/ML50_langs.txt',
    source_lang="si_LK",
    target_lang="en_XX",
)
si2en.eval()

sentence_list = ['දුම්කොළ බදු පනත යටතේ අය කරනු ලබන දඩ මුදලින් සියයට 25ක් තෑගි මුදල් ගෙවීමට හිමිකම් ලබයි .']
translation = si2en.sample(sentence_list, beam=5)
print(translation)
breakpoint()
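Since the checkpoint was fine-tuned with the translation_multi_simple_epoch task, one variant worth trying is to pass the task and language pair explicitly when loading. The following is only a sketch: it assumes that from_pretrained forwards extra keyword arguments (here task and lang_pairs) as argument overrides for the loaded checkpoint, and it is not verified to resolve the error.

from fairseq.models.bart import BARTModel

BASEDIR = '/userdirs/abc/pretrained_models/mbart50.pretrained'

# Sketch only: explicitly override the task and language pair that the
# checkpoint was fine-tuned with. The 'task' and 'lang_pairs' overrides
# are assumptions and may not be accepted by this hub interface.
si2en = BARTModel.from_pretrained(
    '/userdirs/abc/si_en_models/checkpoints',
    checkpoint_file='checkpoint_93_75000.pt',
    task='translation_multi_simple_epoch',
    lang_pairs='si_LK-en_XX',
    bpe='sentencepiece',
    sentencepiece_model=f'{BASEDIR}/sentence.bpe.model',
    lang_dict=f'{BASEDIR}/ML50_langs.txt',
    source_lang="si_LK",
    target_lang="en_XX",
)
si2en.eval()
print(si2en.sample(['දුම්කොළ බදු පනත යටතේ අය කරනු ලබන දඩ මුදලින් සියයට 25ක් තෑගි මුදල් ගෙවීමට හිමිකම් ලබයි .'], beam=5))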
Environment
- fairseq Version (e.g., 1.0 or master):
- PyTorch Version (e.g., 1.0)
- OS (e.g., Linux):
- How you installed fairseq (pip, source):
- Build command you used (if compiling from source):
- Python version:
- CUDA/cuDNN version:
- GPU models and configuration:
- Any other relevant information:
Additional context
Hi @aloka2209, did you find a solution to this?