algorithmic-efficiency
Add a function that submissions can call to change the dropout value
Our current API has two dropout-related limitations:
- In the external tuning ruleset we currently read the dropout value from the hparam config and pass it to the model initialization functions, but in the self-tuning ruleset there is no convenient way to specify the dropout value at model initialization.
- Furthermore, there is no way to change the dropout value during training.
A workload function that submitters can call to change the dropout value would remove both of these limitations.
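A minimal sketch of what such a function could look like, assuming a hypothetical `set_dropout` method on the workload (the name and signature are illustrative, not part of the current codebase):

```python
# Hypothetical sketch of the proposed API; names here are illustrative
# and do not exist in the current algorithmic-efficiency codebase.

class Workload:
    """Minimal stand-in for a benchmark workload object."""

    def __init__(self, dropout_rate: float = 0.0):
        self.dropout_rate = dropout_rate

    def set_dropout(self, dropout_rate: float) -> None:
        """Proposed function submissions could call at init or mid-training."""
        if not 0.0 <= dropout_rate < 1.0:
            raise ValueError(f"invalid dropout rate: {dropout_rate}")
        self.dropout_rate = dropout_rate
        # A real implementation would also propagate the new rate to the
        # model's dropout layers (e.g. updating nn.Dropout.p in PyTorch,
        # or re-binding the rate passed to flax.linen.Dropout in JAX).


# With this, a self-tuning submission could pick its own dropout schedule:
workload = Workload()
workload.set_dropout(0.1)   # at model initialization
workload.set_dropout(0.05)  # later, during training
```

This would cover both rulesets with one mechanism: self-tuning submissions set the value themselves, and external tuning submissions can forward the hparam value through the same call.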