Is there anything to pay attention to during the back-translation process?
Is there a strict procedure for back translation? I used fairseq's pre-trained en<=>de transformer models to generate back-translation data for training UDA, but I can't get good results. Using your prepared back-translation data, I do get good results.
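For reference, here is a minimal sketch of the en -> de -> en round-trip pipeline I tried, assuming fairseq's WMT19 torch.hub checkpoints and their `translate()` interface; the model names, the `sampling` flag, and the temperature value are illustrative and may differ from what you used:

```python
import torch

# Load fairseq's pre-trained WMT19 transformer models via torch.hub.
# These are the standard fairseq hub identifiers; substitute your own
# checkpoints if you trained or downloaded different ones.
en2de = torch.hub.load('pytorch/fairseq',
                       'transformer.wmt19.en-de.single_model',
                       tokenizer='moses', bpe='fastbpe')
de2en = torch.hub.load('pytorch/fairseq',
                       'transformer.wmt19.de-en.single_model',
                       tokenizer='moses', bpe='fastbpe')
en2de.eval()
de2en.eval()

sentence = "Unsupervised data augmentation relies on diverse paraphrases."

# Round-trip translation en -> de -> en. Decoding with random sampling
# at a temperature (rather than plain beam search) produces more diverse
# paraphrases; temperature=0.9 is just an example value, not necessarily
# the setting used for the released UDA data.
german = en2de.translate(sentence, sampling=True, temperature=0.9)
paraphrase = de2en.translate(german, sampling=True, temperature=0.9)
print(paraphrase)
```

One thing I was unsure about is the decoding strategy: the UDA paper mentions using random sampling with a tunable temperature instead of beam search, so maybe the diversity of the generated paraphrases is what matters here?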