pytorch-cifar

Using Scheduler

Open HeekangPark opened this issue 5 years ago • 3 comments

Hi, I'm a newbie in deep learning, so please understand even if my question is stupid.

You get your result by changing the learning rate manually from 0.1 to 0.01 and then 0.001 at epochs 150 and 250, respectively.

Could you please tell me why you didn't use torch.optim.lr_scheduler.MultiStepLR? I think using a scheduler

scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[150, 250], gamma=0.1)

and calling scheduler.step() at the end of every epoch would work exactly the same as what you've done manually.
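For what it's worth, here is a minimal sketch of what I mean; the SGD hyperparameters, the 350-epoch count, and the dummy parameter are assumptions on my part, not code from this repository:

```python
import torch

# Minimal sketch (not the repository's code): a single dummy parameter stands in
# for the real model so the schedule itself can be inspected in isolation.
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([param], lr=0.1, momentum=0.9, weight_decay=5e-4)

# Multiply the learning rate by gamma=0.1 at epochs 150 and 250,
# i.e. 0.1 -> 0.01 -> 0.001, matching the manual schedule.
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=[150, 250], gamma=0.1)

for epoch in range(350):
    # ... the usual forward/backward pass would go here ...
    optimizer.step()
    scheduler.step()  # advance the schedule once per epoch
    if epoch in (149, 150, 249, 250):
        print(epoch, scheduler.get_last_lr())  # shows the drops around the milestones
```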

HeekangPark avatar Jul 25 '20 20:07 HeekangPark

I also noticed that. I think the main difference is that resuming from a checkpoint continues training from the best state of the model, whereas the scheduler continues training from the current state, which may not be the best one. However, I don't know whether it matters or not.

densechen avatar Sep 20 '20 08:09 densechen

@HeekangPark The first versions of PyTorch did not have learning rate schedulers (or maybe they did, but they were not as popular back then). So many of the code bases that originated from publications of that era still handle the learning rate manually, simply because people are used to doing it that way (though I would recommend using learning rate schedulers).

@densechen You are correct. But I'm not sure I get your point exactly, and I think I am missing something, so there is a high chance that what I'm about to say is irrelevant (and if it is, you should just ignore it). I just wanted to note that when you are saving your model (to resume later), you can simply save the learning rate scheduler alongside the model, so that later on, when you want to continue from your checkpoint, the scheduler can also be loaded from the checkpoint and you will continue with the learning rate you were supposed to use.
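A rough sketch of what that could look like; the model, the hyperparameters, the dictionary keys, and the file name here are all just illustrative, not the ones this repository uses:

```python
import torch
import torch.nn as nn

# Illustrative setup; in practice the model and hyperparameters come from your training script.
model = nn.Linear(10, 10)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[150, 250], gamma=0.1)

# Saving: store the scheduler's state next to the model and optimizer.
checkpoint = {
    "epoch": 42,  # hypothetical epoch counter
    "model": model.state_dict(),
    "optimizer": optimizer.state_dict(),
    "scheduler": scheduler.state_dict(),
}
torch.save(checkpoint, "checkpoint.pth")

# Resuming: restore all three, so the learning rate picks up where it left off.
checkpoint = torch.load("checkpoint.pth")
model.load_state_dict(checkpoint["model"])
optimizer.load_state_dict(checkpoint["optimizer"])
scheduler.load_state_dict(checkpoint["scheduler"])
start_epoch = checkpoint["epoch"] + 1
```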

In my experience, there is no downside to using learning rate schedulers at all, and I would absolutely recommend them.

AminJun avatar May 20 '22 17:05 AminJun

Hello: I have received your email and will reply as soon as possible. 刘洪宇

yolunghiu avatar May 20 '22 17:05 yolunghiu