keras_lr_finder

Stop when we reach end_lr even if the loss did not diverge

Open Vermeille opened this issue 6 years ago • 2 comments

Stop when we reach end_lr even if the loss did not diverge. In some cases the loss never increases and diverges, so make sure we stop at end_lr (as the user asked for anyway).
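The proposed behavior can be sketched in plain Python, outside of Keras. This is a simplified model of the LR range test loop, not the library's actual implementation; the function name `run_lr_finder` and the `4 * best_loss` divergence threshold are assumptions for illustration.

```python
import math

def run_lr_finder(losses, start_lr, end_lr, num_batches):
    """Simulate the LR range test; return the lr at which it stopped.

    `losses` is a list of per-batch loss values. Sketch only: the real
    finder runs inside a Keras training loop.
    """
    lr_mult = (end_lr / start_lr) ** (1 / num_batches)
    lr = start_lr
    best_loss = math.inf
    for loss in losses:
        best_loss = min(best_loss, loss)
        # Original stop condition: the loss diverged.
        if loss > 4 * best_loss:
            break
        # Proposed extra condition: stop once lr reaches end_lr,
        # even if the loss never diverged.
        if lr >= end_lr:
            break
        lr *= lr_mult
    return lr
```

With a flat loss curve the loop now terminates at roughly `end_lr` instead of running until the data is exhausted.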

Vermeille avatar Jun 09 '18 15:06 Vermeille

Thanks for the pull request!

Why does this situation happen? The finder uses start_lr and end_lr to calculate the rate of lr increase:

        # computed once, before training starts:
        num_batches = epochs * x_train.shape[0] / batch_size
        self.lr_mult = (end_lr / start_lr) ** (1 / num_batches)

        # applied after every batch:
        lr *= self.lr_mult

So, after going through all the epochs, we should end up at roughly end_lr. If that doesn't happen, it might be better to fix the calculation of lr_mult.
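A quick numeric check confirms this: applying the multiplier once per batch lands on end_lr after num_batches steps. The concrete values below (1e-5 to 1.0 over 500 batches) are assumed for illustration.

```python
# Standalone check that the multiplicative schedule reaches end_lr.
start_lr, end_lr = 1e-5, 1.0
epochs, n_train, batch_size = 1, 50000, 100

num_batches = epochs * n_train / batch_size          # 500 steps
lr_mult = (end_lr / start_lr) ** (1 / num_batches)   # per-batch multiplier

lr = start_lr
for _ in range(int(num_batches)):
    lr *= lr_mult
# lr is now ~1.0, i.e. end_lr, up to floating-point error
```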

surmenok avatar Jun 13 '18 17:06 surmenok

HA!

You're 100% correct. I was a bit too hasty with that one. I had actually seen a very similar piece of code in another repo (almost line-by-line similar), but that one did not call fit() itself, so it couldn't compute the right number of batches.

So, when I saw yours, I immediately thought you'd have the same issue, but you actually don't :)

Sorry for the inconvenience!

Vermeille avatar Jun 13 '18 19:06 Vermeille