
fix lr calculation in get_default_optimizer_params

Open HiHippie opened this issue 1 year ago • 5 comments

This PR fixes the inconvenience of customizing different learning rates for different layers in get_default_optimizer_params (libai/optim/build.py).
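
Not the PR's actual code, just a minimal sketch of what "different lr for different layers" means in terms of param groups; the ToyBlock model, the lr values, and the mlp/non-mlp split below are assumptions for illustration only:

    import oneflow as flow
    from oneflow import nn

    # Toy model standing in for a real transformer block (hypothetical names).
    class ToyBlock(nn.Module):
        def __init__(self):
            super().__init__()
            self.attn = nn.Linear(16, 16)
            self.mlp = nn.Linear(16, 16)

    model = ToyBlock()
    base_lr = 9.375e-06
    mlp_lr = base_lr * 0.1  # custom lr just for the mlp parameters

    # Split parameters into two groups with different base lrs.
    mlp_params = [p for n, p in model.named_parameters() if "mlp" in n]
    other_params = [p for n, p in model.named_parameters() if "mlp" not in n]

    optimizer = flow.optim.AdamW(
        [
            {"params": other_params, "lr": base_lr},
            {"params": mlp_params, "lr": mlp_lr},
        ],
        weight_decay=0.05,
        betas=(0.9, 0.95),
    )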

HiHippie commented Aug 08 '22 03:08

Have you run a local experiment for this? Print the lr of each module under the optimizer and check whether it actually takes effect.

CPFLAME commented Aug 08 '22 03:08

Have you run a local experiment for this? Print the lr of each module under the optimizer and check whether it actually takes effect.

I'll test it this afternoon~

HiHippie commented Aug 08 '22 05:08

I tested it. When warmup_method is not linear, everything works fine.

optimizer.state_dict()["param_groups"]
[{'_options': {'lr': 9.375e-09, 'eps': 1e-08, 'betas': [0.9, 0.95], 'weight_decay': 0.05, 'bias_correction1': 1.0, 'bias_correction2': 1.0, 'do_bias_correction': True, 'amsgrad': False, 'initial_lr': 9.375e-06}, '_enable_clip_grad': False, 'params': [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 150, 151, 152, 153, 154, 155, 156, 157, 158, 159, 160, 161, 162, 163, 164, 165, 166, 167, 168, 169, 170, 171, 172, 173]}, 
{'_options': {'lr': 9.375e-10, 'eps': 1e-08, 'betas': [0.9, 0.95], 'weight_decay': 0.05, 'bias_correction1': 1.0, 'bias_correction2': 1.0, 'do_bias_correction': True, 'amsgrad': False, 'initial_lr': 9.374999999999999e-07}, '_enable_clip_grad': False, 'params': [174, 175, 176, 177, 178, 179, 180, 181, 182, 183, 184, 185, 186, 187, 188, 189, 190, 191, 192, 193, 194, 195, 196, 197, 198, 199, 200, 201, 202, 203, 204, 205, 206, 207, 208, 209, 210, 211, 212, 213, 214, 215, 216, 217, 218, 219, 220, 221, 222, 223, 224, 225, 226, 227, 228, 229, 230, 231, 232, 233, 234, 235, 236, 237, 238, 239, 240, 241, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253]}]
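
For readability, the per-group lrs can be pulled directly out of that state dict; a small sketch relying on the "_options" layout shown above:

    # Print only the lr / initial_lr of each param group, using the
    # "_options" layout shown in the state_dict output above.
    for i, group in enumerate(optimizer.state_dict()["param_groups"]):
        opts = group["_options"]
        print(f"group {i}: lr={opts['lr']}, initial_lr={opts['initial_lr']}")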

But when warmup_method is linear, initializing the scheduler runs the following check.

        if self.warmup_method == "linear":
            if scheduler and self.warmup_prefix is False:
                base_lr = self.base_lrs[0]
                if not np.isclose(self.base_lrs, base_lr).all():
                    raise ValueError(
                        "The param_groups in optimizer have different warmup configs, please use different optimizers."
                    )

Suppose the lr of the mlp layers is changed. Then self.base_lrs becomes [lr of non-mlp layers, lr of mlp layers], which fails the np.isclose(self.base_lrs, base_lr).all() check and raises the ValueError above.
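
For concreteness, the failing check can be reproduced in isolation with the two initial_lr values printed above:

    import numpy as np

    # Two different base lrs (non-mlp vs. mlp) fail the isclose check,
    # so the WarmupLR constructor raises the ValueError quoted above.
    base_lrs = [9.375e-06, 9.375e-07]  # [non-mlp lr, mlp lr]
    base_lr = base_lrs[0]
    print(np.isclose(base_lrs, base_lr).all())  # False -> ValueError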

HiHippie commented Aug 08 '22 08:08

@BBuf

I'm not sure whether my understanding here is correct: when warmup_method is linear, there has to be exactly one end_factor, so it does not support having multiple different base_lr values among your params.

If that is the case, I suggest adding a check on the libai side: if the base_lr values are not all the same, warmup_method cannot be linear and only constant can be used (a rough sketch of such a check follows the snippet below).

https://github.com/Oneflow-Inc/oneflow/blob/b22e7dc32004b032446513252b5372ac7a6dcd1d/python/oneflow/nn/optimizer/warmup_lr.py#L149-L169

        if self.warmup_method == "linear":
            if scheduler and self.warmup_prefix is False:
                base_lr = self.base_lrs[0]
                if not np.isclose(self.base_lrs, base_lr).all():
                    raise ValueError(
                        "The param_groups in optimizer have different warmup configs, please use different optimizers."
                    )

                end_lr = scheduler.get_lr(base_lr, self.warmup_iters)
                end_factor = end_lr / base_lr
            else:
                end_factor = 1.0

            warmup = LinearLR(
                self.optimizer,
                start_factor=self.warmup_factor,
                end_factor=end_factor,
                total_iters=self.warmup_iters,
                last_step=self.last_step,
                verbose=self.verbose,
            )
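
A rough sketch of the guard suggested above; the function name and where it would be called from are assumptions, not actual libai or OneFlow code:

    import numpy as np

    def check_warmup_method(optimizer, warmup_method):
        # Hypothetical helper: reject linear warmup when the param groups
        # carry different base lrs, and point the user to constant warmup.
        base_lrs = [g["_options"]["lr"] for g in optimizer.state_dict()["param_groups"]]
        if warmup_method == "linear" and not np.isclose(base_lrs, base_lrs[0]).all():
            raise ValueError(
                "param_groups have different base lrs; warmup_method='linear' "
                "is not supported in this case, please use 'constant' instead."
            )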

CPFLAME commented Aug 09 '22 02:08


My understanding is the same as yours.

BBuf commented Aug 09 '22 03:08