
Reduce code redundancy

daemon opened this issue 6 years ago · 1 comment

A few issues we should address:

  1. Redundant argument parsers throughout the codebase. I think moving to a hierarchical JSON config system makes sense: a single global configuration for all the universal settings, like the learning rate, weight decay, and optimizer, plus model-specific configurations that inherit from it and override only what differs. Does that seem like a sane choice?

  2. Redundant LSTM baseline and regularization models. Is there a reason we need two models (both named LSTMBaseline)? Isn't LSTM-reg just LSTM-baseline with regularization?

  3. Other obviously redundant code [1] [2].
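To make the first point concrete, here is a minimal sketch of what the hierarchical config loading could look like. The file contents, key names, and values are hypothetical, not Castor's actual settings; the point is just that a model config inherits global defaults and overrides selectively.

```python
import json

def merge_configs(base, override):
    """Recursively merge two config dicts; keys in `override` win.
    Nested dicts are merged rather than replaced wholesale."""
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge_configs(merged[key], value)
        else:
            merged[key] = value
    return merged

# Hypothetical global config shared by all models.
global_config = json.loads("""
{
  "optimizer": "adam",
  "learning_rate": 0.001,
  "weight_decay": 0.0001
}
""")

# Hypothetical model-specific config: inherits the global settings
# and overrides only the learning rate, adding its own keys.
lstm_config = json.loads("""
{
  "learning_rate": 0.0005,
  "hidden_size": 300
}
""")

config = merge_configs(global_config, lstm_config)
print(config["optimizer"])      # inherited from the global config
print(config["learning_rate"])  # overridden by the model config
```

Each model would then need only a small override file, and every universal setting lives in exactly one place.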

daemon · Jan 26 '19 21:01

For the second point, no, we don't need two models. I did something similar when regularizing KimCNN, and it works fine to have a single model with optional regularization parameters.
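A sketch of that merge, with hypothetical parameter names rather than Castor's actual API: one class whose regularization parameters default to zero, so the plain baseline and the regularized variant are the same model with different arguments.

```python
class LSTMBaseline:
    """Single LSTM model with optional regularization (sketch).
    With every regularization parameter at its zero default this is
    the plain baseline; nonzero values recover the LSTM-reg variant,
    so a second, duplicate class is unnecessary."""

    def __init__(self, hidden_size, num_layers=1,
                 dropout=0.0, embed_dropout=0.0, weight_decay=0.0):
        self.hidden_size = hidden_size
        self.num_layers = num_layers
        self.dropout = dropout
        self.embed_dropout = embed_dropout
        self.weight_decay = weight_decay

    @property
    def is_regularized(self):
        # True only when some regularization is actually enabled.
        return any(p > 0 for p in
                   (self.dropout, self.embed_dropout, self.weight_decay))

baseline = LSTMBaseline(hidden_size=300)
reg = LSTMBaseline(hidden_size=300, dropout=0.5, weight_decay=1e-4)
print(baseline.is_regularized)  # False
print(reg.is_regularized)       # True
```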

achyudh · Jan 27 '19 01:01