mBART50 Translation/Fine Tuning with Many-to-One Model not working
Hello, I think there is something wrong with the architecture setting. If I inspect the args saved in the mbart50.ft.n1 checkpoint, it says `"arch": "denoising_large"`, but as far as I can see, `denoising_large` is not a registered architecture in fairseq.
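For reference, here is a minimal sketch of how the saved architecture name can be read out of a fairseq checkpoint's state dict. The `get_arch` helper and the stand-in `state` dict below are hypothetical illustrations (a real checkpoint would come from `torch.load("mbart50.ft.n1/model.pt", map_location="cpu")`); older checkpoints store an `args` namespace, newer ones a `cfg` dict.

```python
from types import SimpleNamespace

def get_arch(state):
    """Return the saved architecture name from a fairseq checkpoint state dict.

    Older checkpoints keep an argparse-style namespace under "args";
    newer ones keep a config dict under "cfg" with the name in cfg["model"].
    """
    args = state.get("args")
    if args is not None:
        return getattr(args, "arch", None)
    cfg = state.get("cfg") or {}
    model_cfg = cfg.get("model") if isinstance(cfg, dict) else None
    return model_cfg.get("_name") if isinstance(model_cfg, dict) else None

# Stand-in for torch.load(".../model.pt"); mimics what mbart50.ft.n1 reports.
state = {"args": SimpleNamespace(arch="denoising_large")}
print(get_arch(state))  # prints "denoising_large"
```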
To Reproduce:
```
path_2_data=/home/sergej/fairseq/data4translation/
model=/home/sergej/fairseq/mbart50.ft.n1/model.pt
lang_list=mbart50.ft.n1/ML50_langs.txt
source_lang=de_DE
lang_pairs=de_DE-en_XX
target_lang=en_XX

fairseq-generate $path_2_data \
  --path $model \
  --task translation_multi_simple_epoch \
  --gen-subset test \
  --source-lang $source_lang \
  --target-lang $target_lang \
  --sacrebleu \
  --remove-bpe 'sentencepiece' \
  --batch-size 32 \
  --encoder-langtok "src" \
  --decoder-langtok \
  --lang-dict "$lang_list" \
  --lang-pairs "$lang_pairs" > ${source_lang}_${target_lang}_mBART50FTN1_on_KWS_Test.txt
```
Error:
```
Traceback (most recent call last):
  File "/home/sergej/nlp/bin/fairseq-generate", line 33, in
```