
Early stopping

Open derekehyatt opened this issue 11 months ago • 2 comments

Stop training early if the validation loss does not improve within `early_stopping_iters` evaluations (default: off).

derekehyatt avatar Mar 09 '24 17:03 derekehyatt

@karpathy - would love a review for this simple early stopping flag. Thanks!

derekehyatt avatar Mar 09 '24 17:03 derekehyatt

Observations:

- **Correct implementation:** The implementation looks correct in terms of functionality. It allows training to stop early when there is no improvement in validation loss, saving computational resources.
- **Patience logic:** The patience logic is clear and straightforward: it resets patience to 0 when there is an improvement in validation loss and increments it otherwise.
- **Mutual exclusivity check:** The script correctly checks that `early_stopping_iters` and `always_save_checkpoint` are not used together.

Possible improvements:

- **Log more information on early stopping:** When early stopping is triggered, it might be helpful to log not just the number of patience iterations but also the final best validation loss and step number.
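The PR diff itself is not shown in this thread, but the patience logic described above (reset on improvement, increment otherwise, stop at the threshold) can be sketched roughly as follows. All names here (`train`, `val_losses`, `early_stopping_iters`) are hypothetical, not nanoGPT's actual code:

```python
def train(val_losses, early_stopping_iters=0):
    """Consume a stream of validation losses; stop early once the loss
    has failed to improve for `early_stopping_iters` consecutive evals.
    early_stopping_iters=0 disables early stopping (the default).
    Hypothetical sketch, not the actual nanoGPT train.py."""
    best_val_loss = float("inf")
    patience = 0
    step = -1
    for step, val_loss in enumerate(val_losses):
        if val_loss < best_val_loss:
            best_val_loss = val_loss
            patience = 0  # improvement: reset patience
        else:
            patience += 1  # no improvement this eval
        if early_stopping_iters and patience >= early_stopping_iters:
            # Per the review suggestion: log the best loss and step too,
            # not just the patience count.
            print(f"early stop at step {step}: best val loss "
                  f"{best_val_loss:.4f} after {patience} stagnant evals")
            return step, best_val_loss
    return step, best_val_loss
```

For example, with losses `[3.0, 2.5, 2.6, 2.7, 2.8]` and `early_stopping_iters=3`, the run stops at the fifth eval with a best loss of 2.5, while `early_stopping_iters=0` processes the whole stream.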

dimentox avatar Sep 07 '24 19:09 dimentox