
HP Borrowing and Scaling

Open · bitfort opened this issue 5 years ago · 3 comments

How does HP borrowing work with scaling up a submission?

bitfort · May 23 '19 18:05

SWG Notes:

Question: during parameter borrowing, can a submitter re-submit on a larger scale? If not, can a submitter submit a non-converging model on a large system in the hopes of borrowing HPs?

Proposal:

To resubmit benchmark B at a larger scale X (a sketch of this check follows the list):

  1. You must have submitted some benchmark C at scale Y where Y>=X -- "Prove you can go that big"
  2. You must have submitted B in a technically comparable configuration, so that the only modification in the resubmission is HP borrowing
  3. The scale X must be larger than anything else you submitted for benchmark B
  4. You cannot withdraw benchmark C at scale Y, and it must be compliant or made compliant
  5. All submissions must have "converged" (N-1 convergences using olympic scoring)
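
For concreteness, here is a minimal sketch of how the conditions above could be checked mechanically. The field names (benchmark, scale, compliant, withdrawn, converged) and the function name are illustrative assumptions, not part of any actual MLPerf submission schema, and rule 2 (technical comparability) is treated as a manual review step rather than an automated check.

```python
# Hypothetical sketch of the proposed resubmission-eligibility check.
# All field and function names are illustrative assumptions.

from dataclasses import dataclass
from typing import List


@dataclass
class Submission:
    benchmark: str      # e.g. "B" or "C"
    scale: int          # system scale (e.g. number of accelerators)
    compliant: bool     # rule 4: compliant, or made compliant
    withdrawn: bool     # rule 4: must not be withdrawn
    converged: bool     # rule 5: N-1 convergences under olympic scoring


def may_resubmit_with_borrowed_hps(benchmark_b: str,
                                   new_scale_x: int,
                                   prior: List[Submission]) -> bool:
    """Return True if resubmitting benchmark_b at new_scale_x satisfies
    the proposed conditions (rule 2, technical comparability, is assumed
    to be a manual review step and is not checked here)."""
    # Rules 1 and 4: some benchmark C at scale Y >= X that is compliant
    # and not withdrawn -- "prove you can go that big".
    if not any(s.scale >= new_scale_x and s.compliant and not s.withdrawn
               for s in prior):
        return False

    prior_b = [s for s in prior if s.benchmark == benchmark_b]

    # Rule 2 (partial): B must already have been submitted.
    if not prior_b:
        return False

    # Rule 3: X must be larger than every prior scale submitted for B.
    if any(s.scale >= new_scale_x for s in prior_b):
        return False

    # Rule 5: all prior submissions must have converged.
    return all(s.converged for s in prior)
```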

AI: Review this.

bitfort · May 23 '19 18:05

Will revisit for v0.8

bitfort · Jun 11 '20 16:06

@bitfort @petermattson If we are deferring to 0.8, then what did we decide to do for 0.7? I am unclear on the proposed rule or process.

jonathan-cohen-nvidia · Jun 17 '20 22:06