
Add param_group name for BaseFinetuningCallback

Jserax opened this issue on Aug 13, 2024 · 0 comments

Description & Motivation

Add an option to pass a param_group name when using the unfreeze_and_add_param_group method of the finetuning callback, for better LR logging. As the documentation says:

If an optimizer has multiple parameter groups, they will be named Adam/pg1, Adam/pg2

so when we add a new param_group, it shows up only under a generic label such as Adam/pg2 rather than a meaningful name.
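For illustration, here is a minimal sketch of the naming behavior this relies on (the module and layer names are hypothetical; it assumes the lightning.pytorch import path). When a parameter group carries a "name" key, LearningRateMonitor uses it; otherwise it falls back to the generic pgN suffix:

```python
import torch
from torch import nn
import lightning.pytorch as pl
from lightning.pytorch.callbacks import LearningRateMonitor


class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Linear(32, 16)
        self.head = nn.Linear(16, 2)

    def configure_optimizers(self):
        # Two parameter groups: only the second carries an explicit "name" key.
        return torch.optim.Adam(
            [
                {"params": self.backbone.parameters()},              # logged as Adam/pg1
                {"params": self.head.parameters(), "name": "head"},  # logged as Adam/head
            ],
            lr=1e-3,
        )


# The monitor falls back to pg1, pg2, ... only for groups without a "name" key.
lr_monitor = LearningRateMonitor(logging_interval="step")
```

BaseFinetuning.unfreeze_and_add_param_group currently offers no way to set that "name" key on the group it creates, which is the gap this issue proposes to close.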

Pitch

Change from:

optimizer.add_param_group({"params": params, "lr": params_lr / denom_lr})

to:

optimizer.add_param_group({"params": params, "lr": params_lr / denom_lr, "name": unfreeze_params_name})

where unfreeze_params_name is a name supplied by the caller.
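A minimal sketch of what the change could look like, mirroring the current body of BaseFinetuning.unfreeze_and_add_param_group; the name argument and the NamedFinetuning subclass are hypothetical illustrations, not part of the library:

```python
from typing import Iterable, Optional, Union

from torch.nn import Module
from torch.optim.optimizer import Optimizer
from lightning.pytorch.callbacks.finetuning import BaseFinetuning


class NamedFinetuning(BaseFinetuning):
    @staticmethod
    def unfreeze_and_add_param_group(
        modules: Union[Module, Iterable[Union[Module, Iterable]]],
        optimizer: Optimizer,
        lr: Optional[float] = None,
        initial_denom_lr: float = 10.0,
        train_bn: bool = True,
        name: Optional[str] = None,  # proposed addition
    ) -> None:
        # Same steps as the upstream implementation: make the modules
        # trainable, derive the group's LR, and collect its parameters.
        BaseFinetuning.make_trainable(modules)
        params_lr = optimizer.param_groups[0]["lr"] if lr is None else float(lr)
        denom_lr = initial_denom_lr if lr is None else 1.0
        params = BaseFinetuning.filter_params(modules, train_bn=train_bn, requires_grad=True)
        params = BaseFinetuning.filter_on_optimizer(optimizer, params)
        if params:
            group = {"params": params, "lr": params_lr / denom_lr}
            # Optional "name" key, which LearningRateMonitor would then use
            # instead of the generic pgN suffix.
            if name is not None:
                group["name"] = name
            optimizer.add_param_group(group)
```

A finetuning callback could then call, e.g., self.unfreeze_and_add_param_group(pl_module.backbone, optimizer, name="backbone") inside finetune_function, and the new group would be logged as Adam/backbone instead of Adam/pg2.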

Alternatives

No response

Additional context

No response

cc @borda
