
configs/training.json doesn't work

Open aspnetcs opened this issue 3 years ago • 6 comments

configs/training.json seems to be useless: changing the learning rate in it has no effect at all. The learning rate schedule I have been using is CosineAnnealingLR.

aspnetcs avatar Feb 26 '22 10:02 aspnetcs

Did you change lr_init? lr_init is the initial learning rate, while CosineAnnealingLR is a learning rate scheduler.
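For illustration, here is a minimal sketch of that distinction in plain PyTorch (the lr=0.001 and T_max=50 values are placeholders, not this repo's config):

```python
import torch

model = torch.nn.Linear(10, 2)

# lr=0.001 plays the role of lr_init: it seeds the optimizer
optimizer = torch.optim.SGD(model.parameters(), lr=0.001)

# the scheduler then modulates that initial rate over training
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=50)

for epoch in range(50):
    optimizer.zero_grad()
    loss = model(torch.randn(8, 10)).pow(2).mean()  # dummy loss
    loss.backward()
    optimizer.step()
    scheduler.step()  # anneals lr along a cosine from the initial lr toward eta_min (default 0)
```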

LinXueyuanStdio avatar Feb 28 '22 02:02 LinXueyuanStdio

Yes. The file LaTeX_OCR_PRO/configs/training.json still seems to be useless.

aspnetcs avatar Mar 01 '22 02:03 aspnetcs

Please refer to model/utils/lr_schedule.py, which defines the class LRSchedule. Warm-up (lr_warm, end_warm) and decay (start_decay, end_decay) are used to schedule the learning rate; the learning rate equals lr_init only when end_warm < epoch < start_decay.
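A rough sketch of the shape that description implies — this is a reconstruction, not the repo's actual code; the linear decay and all default values are assumptions, so see model/utils/lr_schedule.py for the real LRSchedule:

```python
def scheduled_lr(epoch, lr_init=1e-3, lr_warm=1e-4,
                 end_warm=2, start_decay=10, end_decay=30, lr_min=1e-5):
    """Reconstructed warm-up / plateau / decay schedule (values are placeholders)."""
    if epoch < end_warm:
        return lr_warm                        # warm-up phase
    if epoch < start_decay:
        return lr_init                        # plateau: lr is exactly lr_init only here
    if epoch < end_decay:
        frac = (epoch - start_decay) / (end_decay - start_decay)
        return lr_init + frac * (lr_min - lr_init)  # decay toward lr_min
    return lr_min
```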

LinXueyuanStdio avatar Mar 01 '22 07:03 LinXueyuanStdio

This feels overly complicated. How can I use PyTorch's lr_scheduler.MultiplicativeLR for training instead? Thanks!

aspnetcs avatar Mar 02 '22 09:03 aspnetcs

You can refer to https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.MultiplicativeLR.html for PyTorch's MultiplicativeLR.
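For reference, a minimal usage sketch (the 0.95 factor is just an example; lr_lambda returns the per-epoch multiplicative factor):

```python
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.001)

# each scheduler.step() multiplies the current lr by lr_lambda(epoch)
scheduler = torch.optim.lr_scheduler.MultiplicativeLR(
    optimizer, lr_lambda=lambda epoch: 0.95
)

for epoch in range(10):
    optimizer.zero_grad()
    loss = model(torch.randn(8, 10)).pow(2).mean()  # dummy loss
    loss.backward()
    optimizer.step()
    scheduler.step()
    print(epoch, scheduler.get_last_lr())
```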

LinXueyuanStdio avatar Mar 02 '22 09:03 LinXueyuanStdio

Emmm... it seems you are trying to reproduce this with PyTorch? The lr scheduler is only a trick to improve performance; you can try any learning rate schedule on your own, or even drop the scheduler entirely and use a fixed learning rate.

LinXueyuanStdio avatar Mar 02 '22 09:03 LinXueyuanStdio