
Reduce code redundancy in Pre-trained modules and the corresponding unittests

Open · gpengzhi opened this issue · 0 comments

"What worries me is that our implementation of modules based on pre-trained stuff is a bit too repetitive, so that a seemingly small change would require modifying a bunch of files. This is also true for a lot of tests (not limited to pre-trained ones). Let's keep this in mind so we can improve this in the future."

Originally posted by @huzecong in #220

For the code redundancy in the models, I feel that we can extract task-specific heads shared by all or most of the pre-trained models. For the testing issue, many of the tests can be shared across different models: we could write a common test script that a whole group of classes reuses for unit testing.
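One way to share tests across models is a mixin that holds the common test methods, with each per-model test case only supplying its model class. The sketch below is a minimal illustration of that pattern, not the texar-pytorch API: `MODEL_CLASS`, `ToyEncoder`, and the test names are hypothetical, and plain Python classes stand in for the real `torch.nn.Module` models to keep the example self-contained.

```python
import unittest


class PretrainedModuleTestMixin:
    """Shared checks reused across per-model test cases.

    Subclasses only set MODEL_CLASS (a hypothetical hook name);
    the shared test methods then run unchanged for every model.
    In texar-pytorch the models would be torch.nn.Modules; plain
    classes are used here so the sketch runs on its own.
    """

    MODEL_CLASS = None

    def test_has_parameters(self):
        # Every pre-trained module should expose trainable parameters.
        model = self.MODEL_CLASS()
        self.assertGreater(len(model.parameters()), 0)

    def test_forward_preserves_batch(self):
        # Forward output should keep the batch dimension of the input.
        model = self.MODEL_CLASS()
        batch = [[1, 2, 3], [4, 5, 6]]
        outputs = model.forward(batch)
        self.assertEqual(len(outputs), len(batch))


class ToyEncoder:
    """Stand-in for one concrete pre-trained model."""

    def parameters(self):
        return [0.1, 0.2]

    def forward(self, batch):
        return [sum(example) for example in batch]


class ToyEncoderTest(PretrainedModuleTestMixin, unittest.TestCase):
    # Adding a new model's tests is now a two-line subclass.
    MODEL_CLASS = ToyEncoder
```

With this layout, a seemingly small change to the shared checks lives in one place instead of being copied into every model's test file.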

gpengzhi · Oct 01 '19