attention-is-all-you-need-pytorch
Now the model depends on specific preprocessing method too much
I used an older version of this package to build a sequence-to-sequence model (not translation), and now I want to use the new version to build such a model again. However, the new version no longer supports custom data, and it is very hard to modify the Translator class because everything is tied to TranslationDataset. It would be nice if the code could stay closer to a generic Transformer, with the translation-specific functions made optional.
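A minimal sketch of the kind of decoupling being requested: a decoding helper that depends only on a model callable and a few token indices, rather than on any dataset class. All names here (`greedy_decode`, `step_fn`, the toy scorer) are hypothetical illustrations, not the repo's actual API.

```python
def greedy_decode(step_fn, bos_idx, eos_idx, max_len):
    """Greedy decoding that depends only on a step function.

    step_fn(prefix) returns a list of scores over the vocabulary
    for the next token; it can wrap any model or data pipeline,
    so the decoder has no coupling to a specific dataset class.
    """
    seq = [bos_idx]
    for _ in range(max_len):
        scores = step_fn(seq)
        # Pick the highest-scoring next token (greedy choice).
        next_tok = max(range(len(scores)), key=scores.__getitem__)
        seq.append(next_tok)
        if next_tok == eos_idx:
            break
    return seq

# Hypothetical toy step function over a 4-token vocabulary;
# token 3 (the EOS index here) always scores highest.
toy = lambda prefix: [0.0, 0.1, 0.2, 0.9]
print(greedy_decode(toy, bos_idx=1, eos_idx=3, max_len=5))  # [1, 3]
```

With an interface like this, a Translator refactored along these lines could accept custom vocabularies and tensors directly instead of requiring a TranslationDataset.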
Hi ylmeng,
You are right. I will try to refactor this part. Thanks.
ylmeng [email protected] wrote on Thursday, March 5, 2020, at 5:45 PM: