
Add "interval": "validation" to scheduler configuration

Open de-gozaru opened this issue 3 years ago • 3 comments

🚀 Feature

This request concerns the learning-rate scheduler configuration.

Currently, a scheduler can be configured like this:

lr_scheduler_config = {
    "scheduler": lr_scheduler,
    "interval": "epoch",   # when to step the scheduler: "epoch" or "step"
    "frequency": 1,        # step every `frequency` intervals
    "monitor": "val_loss", # metric watched by schedulers such as ReduceLROnPlateau
    "strict": True,        # raise an error if the monitored metric is not found
    "name": None,          # name used by the LearningRateMonitor callback
}

It would be great to add a new option, "interval": "validation", that steps the scheduler after each validation run.

Motivation

My use case is the following: I want to .step() the scheduler after each validation run, using a validation metric with a ReduceLROnPlateau scheduler.

Sometimes (1) I train on a larger dataset, so I set val_check_interval=0.1 (validation runs 10 times per training epoch). Other times (2) I train on a smaller dataset, so I set check_val_every_n_epoch=5 (validation runs once every 5 epochs). The corresponding Trainer settings are sketched below.
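For reference, the two validation schedules correspond to these Trainer settings (the values are just the ones from my use case):

from pytorch_lightning import Trainer

# Case (1): large dataset, validate 10 times per training epoch
trainer_large = Trainer(val_check_interval=0.1)

# Case (2): small dataset, validate once every 5 training epochs
trainer_small = Trainer(check_val_every_n_epoch=5)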

The current interval options do not cover this. "step" steps the scheduler after every optimizer step, and "epoch" steps it after every training epoch. In case (1), the scheduler is therefore stepped only once per 10 validation runs; in case (2), it is stepped at the end of every epoch, including the epochs where no validation ran, so the monitored metric is missing and training fails with: pytorch_lightning.utilities.exceptions.MisconfigurationException: ReduceLROnPlateau conditioned on metric my_quantity which is not available. Available metrics are: ['train/loss']. Condition can be set using 'monitor' key in lr scheduler dict.
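Until something like "interval": "validation" exists, the closest workaround I know of is to keep the scheduler out of the lr_scheduler dict and step it manually at the end of each validation run. A sketch, assuming the metric is logged as "val_loss" (the model and metric name are placeholders):

import torch
import pytorch_lightning as pl
from torch.optim.lr_scheduler import ReduceLROnPlateau

class LitModel(pl.LightningModule):
    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        # Keep a reference instead of returning a scheduler config,
        # so Lightning does not step the scheduler on its own schedule.
        self.plateau_scheduler = ReduceLROnPlateau(optimizer, mode="min")
        return optimizer

    def on_validation_epoch_end(self):
        # Skip the sanity-check pass, where "val_loss" is not meaningful.
        if self.trainer.sanity_checking:
            return
        val_loss = self.trainer.callback_metrics.get("val_loss")
        if val_loss is not None:
            self.plateau_scheduler.step(val_loss)

This works, but as far as I can tell the scheduler's state is then not included in Lightning checkpoints unless I save and restore it myself, which is why a native "interval": "validation" would be nicer.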

I hope this is clear for you.

cc @borda @tchaton @rohitgr7

de-gozaru · Dec 22 '21 10:12