tfyolo
Learning rate scheduler not updated in train_step
My TensorFlow version is 2.1.0. I found that when calling step() on the learning rate scheduler inside the training step function, lr is not actually updated (the scheduler works fine when tested on its own). I suspect this is related to how the distributed strategy runs the step: the step function is traced into a graph, so Python-side state changes inside it only happen once at trace time. The problem is fixed by moving the learning rate update into the main loop instead of the training step function.
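A minimal pure-Python sketch of the suggested fix (no TensorFlow dependency, so the graph-tracing behavior itself is not reproduced here): the scheduler's step() is called in the eager main loop, and the resulting lr is passed into the step function. The names `ExponentialDecayScheduler` and `train_step` are hypothetical, not from the repo.

```python
# Sketch of the fix described above: update the learning rate in the
# outer training loop, not inside the (graph-compiled) train step.
# All names here are illustrative placeholders.

class ExponentialDecayScheduler:
    """Toy exponential-decay LR scheduler with an explicit step()."""

    def __init__(self, initial_lr, decay_rate):
        self.lr = initial_lr
        self.decay_rate = decay_rate

    def step(self):
        # In a tf.function-wrapped train step, a Python-side mutation
        # like this would only run once at trace time; that is why the
        # update has to live in the eager main loop instead.
        self.lr *= self.decay_rate
        return self.lr


def train_step(batch, lr):
    # Stand-in for the distributed train step; it simply uses the lr
    # value handed to it by the outer loop.
    return lr


scheduler = ExponentialDecayScheduler(initial_lr=0.01, decay_rate=0.5)
lrs = []
for batch in range(3):
    lr = scheduler.step()          # update lr in the main loop (the fix)
    lrs.append(train_step(batch, lr))

print(lrs)  # each step sees a freshly decayed lr
```

In real TensorFlow code the same pattern would assign the new value to the optimizer's learning rate variable in the outer loop before invoking the distributed step.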
https://github.com/LongxingTan/Yolov5/blob/88acfd988decc4cc78335cfb6eb50f1975294c1f/yolo/train.py#L122
Hi @Darkhunter9, thanks for checking the code so carefully. You found many bugs and kindly provided solutions; I will look into this. Thanks!