Add param_group name support to the BaseFinetuning callback
Description & Motivation
Add an option to pass a param_group name when calling the `unfreeze_and_add_param_group` method of the finetuning callback, for better LR logging. As the documentation says:

> If the optimizer has multiple parameter groups they will be named Adam/pg1, Adam/pg2

so when we add a new param_group, it shows up under the generic label Adam/pg2 rather than under a meaningful name.
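For illustration, a minimal sketch of the current behavior (the model and optimizer below are made up): with two parameter groups, the learning rates are logged only under positional labels, so the curves are hard to tell apart.

```python
import torch
from pytorch_lightning.callbacks import LearningRateMonitor

# Hypothetical two-stage model: a backbone plus a head.
model = torch.nn.Sequential(torch.nn.Linear(4, 4), torch.nn.Linear(4, 2))

# Train only the head at first; the backbone is later unfrozen and added
# as a second param group (this is what the finetuning callback does).
optimizer = torch.optim.Adam(model[1].parameters(), lr=1e-3)
optimizer.add_param_group({"params": model[0].parameters(), "lr": 1e-4})

# With LearningRateMonitor attached to the Trainer, the two learning
# rates are logged as "Adam/pg1" and "Adam/pg2" -- positional names only.
lr_monitor = LearningRateMonitor(logging_interval="step")
```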
Pitch
Change from:

```python
optimizer.add_param_group({"params": params, "lr": params_lr / denom_lr})
```

to:

```python
optimizer.add_param_group({"params": params, "lr": params_lr / denom_lr, "name": unfreeze_params_name})
```
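A minimal sketch of how this could look, mirroring the body of the current `unfreeze_and_add_param_group`; the `name` argument and the `NamedFinetuning` subclass are illustrations of the proposal, not existing library API:

```python
from typing import Iterable, Optional, Union

from torch.nn import Module
from torch.optim import Optimizer

from pytorch_lightning.callbacks import BaseFinetuning


class NamedFinetuning(BaseFinetuning):
    @staticmethod
    def unfreeze_and_add_param_group(
        modules: Union[Module, Iterable[Module]],
        optimizer: Optimizer,
        lr: Optional[float] = None,
        initial_denom_lr: float = 10.0,
        train_bn: bool = True,
        name: Optional[str] = None,  # proposed: label for the new param group
    ) -> None:
        # Everything except the `name` handling mirrors the existing method.
        BaseFinetuning.make_trainable(modules)
        params_lr = optimizer.param_groups[0]["lr"] if lr is None else float(lr)
        denom_lr = initial_denom_lr if lr is None else 1.0
        params = BaseFinetuning.filter_params(modules, train_bn=train_bn, requires_grad=True)
        params = BaseFinetuning.filter_on_optimizer(optimizer, params)
        if params:
            group = {"params": params, "lr": params_lr / denom_lr}
            if name is not None:
                group["name"] = name  # carried through to the optimizer's param_groups
            optimizer.add_param_group(group)
```

A finetuning callback could then call e.g. `unfreeze_and_add_param_group(pl_module.backbone, optimizer, name="backbone")`, so the new group's LR is logged as Adam/backbone instead of Adam/pg2, assuming LearningRateMonitor picks up the optional `"name"` key as motivated above.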
Alternatives
No response
Additional context
No response
cc @borda