
Add function that submissions can call that can change the dropout value

Open priyakasimbeg opened this issue 1 year ago • 0 comments

Our current API has two dropout-related limitations:

  1. In the external tuning ruleset, we read the dropout value from the hparam config and pass it to the model initialization functions. In the self-tuning ruleset, there is no convenient way to specify the dropout value at model initialization.
  2. Furthermore, there is no way to change the dropout value during training in either ruleset.

Adding a workload function that submitters can call to change the dropout value would remove both of these limitations.
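As a rough illustration of what such a function could look like, here is a minimal PyTorch sketch that updates the dropout probability of every `nn.Dropout` submodule in place. The function name `update_dropout` and its signature are hypothetical, not part of the current API:

```python
import torch
from torch import nn


def update_dropout(model: nn.Module, dropout_rate: float) -> None:
    """Hypothetical helper: set the dropout probability on every
    nn.Dropout submodule of `model`, in place.

    A submission could call this mid-training (e.g. to anneal dropout),
    which addresses both limitations described above.
    """
    for module in model.modules():
        if isinstance(module, nn.Dropout):
            module.p = dropout_rate


# Example usage with a toy model initialized with dropout 0.1.
model = nn.Sequential(
    nn.Linear(8, 8),
    nn.Dropout(p=0.1),
    nn.ReLU(),
    nn.Dropout(p=0.1),
)
update_dropout(model, 0.3)
print([m.p for m in model.modules() if isinstance(m, nn.Dropout)])  # [0.3, 0.3]
```

Because `nn.Dropout` reads `self.p` at every forward pass, mutating it takes effect immediately without re-initializing the model; a JAX workload would instead need to thread the new rate through the model's apply function.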

priyakasimbeg · Sep 12 '24 19:09